sha (string, 40–40) | text (string, 1–13.4M) | id (string, 2–117) | tags (sequence, 1–7.91k) | created_at (string, 25–25) | metadata (string, 2–875k) | last_modified (string, 25–25) | arxiv (sequence, 0–25) | languages (sequence, 0–7.91k) | tags_str (string, 17–159k) | text_str (string, 1–447k) | text_lists (sequence, 0–352) | processed_texts (sequence, 1–353) | tokens_length (sequence, 1–353) | input_texts (sequence, 1–40) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
57fea7503582427f27934368663173b8674b5cb5 | rows 10m to 11m from the DSIR pile | georgeyw/dsir-pile-1m-2 | ["region:us"] | 2023-12-23T15:52:27+00:00 | {} | 2023-12-23T17:40:07+00:00 | [] | [] | TAGS #region-us | rows 10m to 11m from the DSIR pile | [] | ["TAGS\n#region-us \n"] | [6] | ["passage: TAGS\n#region-us \n"] |
1a4336a67a82ac34a7dba0c370166ba59fc7c1d9 |
# Dataset Card for Evaluation run of Felladrin/Llama-160M-Chat-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/Llama-160M-Chat-v1](https://huggingface.co/Felladrin/Llama-160M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1",
"harness_winogrande_5",
split="train")
```
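
The aggregated metrics can be loaded the same way by selecting the "results" configuration; this is a minimal sketch, relying on the "latest" split described above:
```python
from datasets import load_dataset

# Aggregated results of the run; the "latest" split always points to the most recent eval.
results = load_dataset("open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1",
	"results",
	split="latest")
```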
## Latest results
These are the [latest results from run 2023-12-23T16:11:07.691386](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1/blob/main/results_2023-12-23T16-11-07.691386.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.261252659291333,
"acc_stderr": 0.030837999574303904,
"acc_norm": 0.2626209462127548,
"acc_norm_stderr": 0.0316573603453682,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4416088801457481,
"mc2_stderr": 0.01524734599791119
},
"harness|arc:challenge|25": {
"acc": 0.2167235494880546,
"acc_stderr": 0.01204015671348119,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.012610352663292673
},
"harness|hellaswag|10": {
"acc": 0.3123879705238,
"acc_stderr": 0.004625198756710239,
"acc_norm": 0.3529177454690301,
"acc_norm_stderr": 0.004769007545082275
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174023,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174023
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.037738099906869355,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.037738099906869355
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749898,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749898
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993178,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993178
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.0345593020192481,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.0345593020192481
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.02264421261552521,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.02264421261552521
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.18787878787878787,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.18787878787878787,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.03355397369686173,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.03355397369686173
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.02921354941437216,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.02921354941437216
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.018224078117299085,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.018224078117299085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20098039215686275,
"acc_stderr": 0.028125972265654386,
"acc_norm": 0.20098039215686275,
"acc_norm_stderr": 0.028125972265654386
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22346368715083798,
"acc_stderr": 0.013932068638579752,
"acc_norm": 0.22346368715083798,
"acc_norm_stderr": 0.013932068638579752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902006,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596452,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984924,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984924
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.20398009950248755,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.20398009950248755,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4416088801457481,
"mc2_stderr": 0.01524734599791119
},
"harness|winogrande|5": {
"acc": 0.5130228887134964,
"acc_stderr": 0.014047718393997663
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
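
As a small illustration of how this structure can be consumed, the following sketch assumes the JSON above has been saved locally as `results.json` (a hypothetical filename) and prints the accuracy reported for each task:
```python
import json

# Load the results dictionary (same layout as the JSON shown above).
with open("results.json") as f:
    results = json.load(f)

# "all" holds the averaged metrics; the other keys are individual harness tasks.
for task, metrics in results.items():
    if "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f}")
```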
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1 | [
"region:us"
] | 2023-12-23T16:13:03+00:00 | {"pretty_name": "Evaluation run of Felladrin/Llama-160M-Chat-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Felladrin/Llama-160M-Chat-v1](https://huggingface.co/Felladrin/Llama-160M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:11:07.691386](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1/blob/main/results_2023-12-23T16-11-07.691386.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.261252659291333,\n \"acc_stderr\": 0.030837999574303904,\n \"acc_norm\": 0.2626209462127548,\n \"acc_norm_stderr\": 0.0316573603453682,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4416088801457481,\n \"mc2_stderr\": 0.01524734599791119\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2167235494880546,\n \"acc_stderr\": 0.01204015671348119,\n \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.012610352663292673\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3123879705238,\n \"acc_stderr\": 0.004625198756710239,\n \"acc_norm\": 0.3529177454690301,\n \"acc_norm_stderr\": 0.004769007545082275\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.03749850709174023,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.03749850709174023\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n \"acc_stderr\": 0.037738099906869355,\n \"acc_norm\": 0.2847222222222222,\n \"acc_norm_stderr\": 0.037738099906869355\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 
0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749898,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749898\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993178,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993178\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.0345593020192481,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.0345593020192481\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.02264421261552521,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.02264421261552521\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.18787878787878787,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.18787878787878787,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.26262626262626265,\n \"acc_stderr\": 0.031353050095330855,\n \"acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.031353050095330855\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.03355397369686173,\n \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.03355397369686173\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437216,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437216\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.018224078117299085,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.018224078117299085\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.20098039215686275,\n \"acc_stderr\": 0.028125972265654386,\n \"acc_norm\": 0.20098039215686275,\n \"acc_norm_stderr\": 0.028125972265654386\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842538,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842538\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.21076233183856502,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992002,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992002\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22346368715083798,\n \"acc_stderr\": 0.013932068638579752,\n \"acc_norm\": 0.22346368715083798,\n \"acc_norm_stderr\": 0.013932068638579752\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902006,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596452,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984924,\n \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984924\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.20398009950248755,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.20398009950248755,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4416088801457481,\n \"mc2_stderr\": 0.01524734599791119\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5130228887134964,\n \"acc_stderr\": 0.014047718393997663\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Felladrin/Llama-160M-Chat-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-11-07.691386.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-11-07.691386.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-11-07.691386.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-11-07.691386.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-11-07.691386.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_11_07.691386", "path": ["**/details_harness|winogrande|5_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-11-07.691386.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_11_07.691386", "path": ["results_2023-12-23T16-11-07.691386.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-11-07.691386.parquet"]}]}]} | 2023-12-23T16:13:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Felladrin/Llama-160M-Chat-v1
Dataset automatically created during the evaluation run of model Felladrin/Llama-160M-Chat-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
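For example (a minimal sketch; the repository id below is assumed to follow the leaderboard's usual `details_<org>__<model>` naming pattern for this model):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1",
	"harness_winogrande_5",
	split="train")
```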
## Latest results
These are the latest results from run 2023-12-23T16:11:07.691386 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Felladrin/Llama-160M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Llama-160M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:11:07.691386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Felladrin/Llama-160M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Llama-160M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:11:07.691386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Felladrin/Llama-160M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Llama-160M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:11:07.691386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
0b80ff3ed285fd60bbdffd083e2093e55d437de5 |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-02-v0](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-02-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0",
"harness_winogrande_5",
split="train")
```
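If you only need the aggregated metrics rather than the per-sample details, a minimal sketch (assuming the "results" configuration and its "latest" split described above are available for this run):

```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split points to the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0",
	"results",
	split="latest")
```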
## Latest results
These are the [latest results from run 2023-12-23T16:13:04.956201](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0/blob/main/results_2023-12-23T16-13-04.956201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6444747200521589,
"acc_stderr": 0.032006692465818394,
"acc_norm": 0.645214523345659,
"acc_norm_stderr": 0.03265305081994223,
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6051772410999124,
"mc2_stderr": 0.01537548359714006
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918763,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729125
},
"harness|hellaswag|10": {
"acc": 0.6712806213901613,
"acc_stderr": 0.004687877183164464,
"acc_norm": 0.8577972515435173,
"acc_norm_stderr": 0.0034854418127129535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.01362555690799345,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.01362555690799345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6051772410999124,
"mc2_stderr": 0.01537548359714006
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.6724791508718726,
"acc_stderr": 0.012927102210426727
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0 | [
"region:us"
] | 2023-12-23T16:15:21+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-02-v0](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-02-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:13:04.956201](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0/blob/main/results_2023-12-23T16-13-04.956201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6444747200521589,\n \"acc_stderr\": 0.032006692465818394,\n \"acc_norm\": 0.645214523345659,\n \"acc_norm_stderr\": 0.03265305081994223,\n \"mc1\": 0.4430844553243574,\n \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6051772410999124,\n \"mc2_stderr\": 0.01537548359714006\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918763,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729125\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6712806213901613,\n \"acc_stderr\": 0.004687877183164464,\n \"acc_norm\": 0.8577972515435173,\n \"acc_norm_stderr\": 0.0034854418127129535\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.01362555690799345,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.01362555690799345\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4430844553243574,\n \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6051772410999124,\n \"mc2_stderr\": 0.01537548359714006\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6724791508718726,\n \"acc_stderr\": 
0.012927102210426727\n }\n}\n```", "repo_url": "https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-02-v0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-13-04.956201.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-13-04.956201.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-13-04.956201.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-13-04.956201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-13-04.956201.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_13_04.956201", "path": ["**/details_harness|winogrande|5_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-13-04.956201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_13_04.956201", "path": ["results_2023-12-23T16-13-04.956201.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-13-04.956201.parquet"]}]}]} | 2023-12-23T16:15:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0
Dataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-02-v0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
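A minimal loading sketch (assuming the repository follows the same `details_<org>__<model>` naming convention as the other evaluation-details datasets in this collection; adjust the repo id if it differs):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the configurations listed in this record's metadata.
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-02-v0",
	"harness_winogrande_5",
	split="train")
```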
## Latest results
These are the latest results from run 2023-12-23T16:13:04.956201 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-02-v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:13:04.956201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-02-v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:13:04.956201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-02-v0\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-02-v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:13:04.956201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
d1114a8e9ab1c9237472bea3de889fc5ea8a25b1 |
# Dataset Card for Evaluation run of Sao10K/Frostwind-10.7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Frostwind-10.7B-v1](https://huggingface.co/Sao10K/Frostwind-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1",
"harness_winogrande_5",
split="train")
```
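The aggregated metrics can be pulled the same way; a minimal sketch, assuming the "results" configuration and "latest" split described above:

```python
from datasets import load_dataset

# Aggregated metrics across all tasks; the "latest" split points to the most
# recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1",
	"results",
	split="latest")
```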
## Latest results
These are the [latest results from run 2023-12-24T15:02:48.376672](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1/blob/main/results_2023-12-24T15-02-48.376672.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6456554215200824,
"acc_stderr": 0.03196591792419325,
"acc_norm": 0.6483076503394334,
"acc_norm_stderr": 0.03261595551857736,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5040791546532097,
"mc2_stderr": 0.015306227142349391
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946705,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.014027516814585188
},
"harness|hellaswag|10": {
"acc": 0.6618203545110536,
"acc_stderr": 0.004721231637092722,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823495
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.0487831731214563,
"acc_norm": 0.38,
"acc_norm_stderr": 0.0487831731214563
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.02402225613030823,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.02402225613030823
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083025,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083025
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009246,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009246
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553325,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632453,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.012768673076111903,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.012768673076111903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306032,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306032
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5040791546532097,
"mc2_stderr": 0.015306227142349391
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292404
},
"harness|gsm8k|5": {
"acc": 0.5276724791508719,
"acc_stderr": 0.013751375538801326
}
}
```
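For programmatic access outside of `datasets`, a minimal sketch (assuming the standard `huggingface_hub` download API) that fetches the results file linked above and reads the aggregated normalized accuracy:

```python
import json

from huggingface_hub import hf_hub_download

# Filename matches the results file linked in the "Latest results" section;
# repo_type="dataset" is required because the details live in a dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1",
    filename="results_2023-12-24T15-02-48.376672.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc_norm"])  # ~0.648, as reported above
```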
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1 | [
"region:us"
] | 2023-12-23T16:16:55+00:00 | {"pretty_name": "Evaluation run of Sao10K/Frostwind-10.7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Frostwind-10.7B-v1](https://huggingface.co/Sao10K/Frostwind-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T15:02:48.376672](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1/blob/main/results_2023-12-24T15-02-48.376672.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6456554215200824,\n \"acc_stderr\": 0.03196591792419325,\n \"acc_norm\": 0.6483076503394334,\n \"acc_norm_stderr\": 0.03261595551857736,\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5040791546532097,\n \"mc2_stderr\": 0.015306227142349391\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946705,\n \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585188\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6618203545110536,\n \"acc_stderr\": 0.004721231637092722,\n \"acc_norm\": 0.8536148177653854,\n \"acc_norm_stderr\": 0.003527695149823495\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.0487831731214563,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.0487831731214563\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810536,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810536\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 
0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.02402225613030823,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.02402225613030823\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 
0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083025,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083025\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009246,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009246\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553325,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632453,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632453\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579828\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.012768673076111903,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.012768673076111903\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306032,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306032\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5040791546532097,\n \"mc2_stderr\": 0.015306227142349391\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292404\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5276724791508719,\n \"acc_stderr\": 0.013751375538801326\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Frostwind-10.7B-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-14-40.601106.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-14-40.601106.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-02-48.376672.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-02-48.376672.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-02-48.376672.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-02-48.376672.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-14-40.601106.parquet"]}, 
{"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["**/details_harness|winogrande|5_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": ["**/details_harness|winogrande|5_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T15-02-48.376672.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_14_40.601106", "path": ["results_2023-12-23T16-14-40.601106.parquet"]}, {"split": "2023_12_24T15_02_48.376672", "path": 
["results_2023-12-24T15-02-48.376672.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T15-02-48.376672.parquet"]}]}]} | 2023-12-24T15:05:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Frostwind-10.7B-v1
Dataset automatically created during the evaluation run of model Sao10K/Frostwind-10.7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
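A minimal sketch, assuming the usual `open-llm-leaderboard/details_<org>__<model>` repository naming for this card (the repository id below is inferred from that convention) and the config/split names listed in its metadata:

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard "details_<org>__<model>" convention.
# "harness_winogrande_5" and the "latest" split are both listed in this card's config metadata.
data = load_dataset("open-llm-leaderboard/details_Sao10K__Frostwind-10.7B-v1",
    "harness_winogrande_5",
    split="latest")
```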
## Latest results
These are the latest results from run 2023-12-24T15:02:48.376672 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sao10K/Frostwind-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Frostwind-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:02:48.376672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Frostwind-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Frostwind-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:02:48.376672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Frostwind-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Frostwind-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T15:02:48.376672(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
376598a1be79df52832d2870a5989bd07e646cbe |
# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Zangs3011/codellama_7b_DolphinCoder](https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
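To pull only the aggregated metrics instead of per-sample details, the same call can target the "results" configuration; a sketch, assuming it exposes a "latest" split alongside the timestamped ones:

```python
from datasets import load_dataset

# Sketch: the "results" configuration holds the aggregated metrics of each run;
# the "latest" split name is assumed from the split naming used by these detail datasets.
results = load_dataset("open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder",
                       "results",
                       split="latest")
```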
## Latest results
These are the [latest results from run 2023-12-23T16:19:11.298968](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder/blob/main/results_2023-12-23T16-19-11.298968.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3827239446273333,
"acc_stderr": 0.034226432114737984,
"acc_norm": 0.3863708183260275,
"acc_norm_stderr": 0.03501715050425477,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|arc:challenge|25": {
"acc": 0.39761092150170646,
"acc_stderr": 0.014301752223279536,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.49432383987253536,
"acc_stderr": 0.004989459871609184,
"acc_norm": 0.6550487950607449,
"acc_norm_stderr": 0.004743808792037848
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36774193548387096,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.36774193548387096,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073328,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073328
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48073394495412847,
"acc_stderr": 0.021421402982548878,
"acc_norm": 0.48073394495412847,
"acc_norm_stderr": 0.021421402982548878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4810126582278481,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.4810126582278481,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.04142313771996665,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.04142313771996665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4380165289256198,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.4380165289256198,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456026,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.017784034534992433,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.017784034534992433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331154,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607715,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607715
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.027374128882631146,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.027374128882631146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.011738669951254293,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.011738669951254293
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079706,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079706
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47953216374269003,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.47953216374269003,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|winogrande|5": {
"acc": 0.6361483820047356,
"acc_stderr": 0.013521488896883408
},
"harness|gsm8k|5": {
"acc": 0.09704321455648218,
"acc_stderr": 0.008153768274554735
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder | [
"region:us"
] | 2023-12-23T16:21:32+00:00 | {"pretty_name": "Evaluation run of Zangs3011/codellama_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/codellama_7b_DolphinCoder](https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:19:11.298968](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder/blob/main/results_2023-12-23T16-19-11.298968.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3827239446273333,\n \"acc_stderr\": 0.034226432114737984,\n \"acc_norm\": 0.3863708183260275,\n \"acc_norm_stderr\": 0.03501715050425477,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39761092150170646,\n \"acc_stderr\": 0.014301752223279536,\n \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303026\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49432383987253536,\n \"acc_stderr\": 0.004989459871609184,\n \"acc_norm\": 0.6550487950607449,\n \"acc_norm_stderr\": 0.004743808792037848\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36774193548387096,\n \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.36774193548387096,\n \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n \"acc_norm\": 0.45595854922279794,\n 
\"acc_norm_stderr\": 0.035944137112724366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073328,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073328\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.48073394495412847,\n \"acc_stderr\": 0.021421402982548878,\n \"acc_norm\": 0.48073394495412847,\n \"acc_norm_stderr\": 0.021421402982548878\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4810126582278481,\n \"acc_stderr\": 0.03252375148090448,\n \"acc_norm\": 0.4810126582278481,\n \"acc_norm_stderr\": 0.03252375148090448\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.4080717488789238,\n \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.04142313771996665,\n \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.04142313771996665\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4380165289256198,\n \"acc_stderr\": 0.045291468044357915,\n \"acc_norm\": 0.4380165289256198,\n \"acc_norm_stderr\": 0.045291468044357915\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456026,\n \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456026\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.017784034534992433,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.017784034534992433\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331154,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331154\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02845263998508801,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02845263998508801\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607715,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607715\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631146,\n \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n \"acc_stderr\": 0.011738669951254293,\n \"acc_norm\": 0.303129074315515,\n \"acc_norm_stderr\": 0.011738669951254293\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079706,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079706\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.5124378109452736,\n \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883408\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.09704321455648218,\n \"acc_stderr\": 0.008153768274554735\n }\n}\n```", "repo_url": "https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["**/details_harness|winogrande|5_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-23T16-19-11.298968.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_19_11.298968", "path": ["results_2023-12-23T16-19-11.298968.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-19-11.298968.parquet"]}]}]} | 2023-12-23T16:21:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder
Dataset automatically created during the evaluation run of model Zangs3011/codellama_7b_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T16:19:11.298968 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:19:11.298968(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:19:11.298968(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:19:11.298968(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
2397270e34349dd44a73d8cf7bc8a73283042d29 |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3",
"harness_winogrande_5",
split="train")
```
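
The aggregated metrics mentioned above live in the separate "results" configuration. As a minimal sketch (assuming the same access pattern, with a "latest" split pointing at the most recent run):

```python
from datasets import load_dataset

# Aggregated per-task metrics for the run (one row per evaluation), rather than per-sample details.
results = load_dataset(
    "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3",
    "results",
    split="latest",
)
```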
## Latest results
These are the [latest results from run 2023-12-23T16:20:58.598253](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3/blob/main/results_2023-12-23T16-20-58.598253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6461774129433163,
"acc_stderr": 0.032111885448396486,
"acc_norm": 0.6473493445530335,
"acc_norm_stderr": 0.03275775413439696,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5780394878984443,
"mc2_stderr": 0.015529814806437723
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6740689105755826,
"acc_stderr": 0.004677637463391395,
"acc_norm": 0.8529177454690301,
"acc_norm_stderr": 0.0035346403488166708
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240634,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240634
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.0160837499868537,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.0160837499868537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5780394878984443,
"mc2_stderr": 0.015529814806437723
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209413
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222084
}
}
```
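
To compare tasks at a glance, the nested dictionary above can be flattened into a simple ranking. This is an illustrative sketch only; it assumes the snippet has been loaded into a Python dict named `results` (the remaining task entries are elided here):

```python
# Assumption: `results` holds the dictionary shown above, keyed by task name.
results = {
    "all": {"acc": 0.6461774129433163, "acc_norm": 0.6473493445530335},
    "harness|arc:challenge|25": {"acc": 0.6160409556313993, "acc_norm": 0.659556313993174},
    # ... remaining task entries elided for brevity ...
}

# Rank the individual tasks by normalized accuracy where it is reported.
ranked = sorted(
    ((task, metrics["acc_norm"]) for task, metrics in results.items()
     if task != "all" and "acc_norm" in metrics),
    key=lambda pair: pair[1],
    reverse=True,
)
for task, acc_norm in ranked:
    print(f"{task:55s} {acc_norm:.3f}")
```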
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3 | [
"region:us"
] | 2023-12-23T16:23:13+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:20:58.598253](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3/blob/main/results_2023-12-23T16-20-58.598253.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6461774129433163,\n \"acc_stderr\": 0.032111885448396486,\n \"acc_norm\": 0.6473493445530335,\n \"acc_norm_stderr\": 0.03275775413439696,\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5780394878984443,\n \"mc2_stderr\": 0.015529814806437723\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6740689105755826,\n \"acc_stderr\": 0.004677637463391395,\n \"acc_norm\": 0.8529177454690301,\n \"acc_norm_stderr\": 0.0035346403488166708\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 
0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240634,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240634\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.0160837499868537,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.0160837499868537\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5780394878984443,\n \"mc2_stderr\": 0.015529814806437723\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209413\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n 
\"acc_stderr\": 0.013023665136222084\n }\n}\n```", "repo_url": "https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-20-58.598253.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-20-58.598253.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-20-58.598253.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-20-58.598253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-20-58.598253.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_20_58.598253", "path": ["**/details_harness|winogrande|5_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-20-58.598253.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_20_58.598253", "path": ["results_2023-12-23T16-20-58.598253.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-20-58.598253.parquet"]}]}]} | 2023-12-23T16:23:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3
Dataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
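For illustration, a minimal loading sketch, assuming the run's details are published under the leaderboard's usual `details_<org>__<model>` naming (i.e. `open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3`):

```python
from datasets import load_dataset

# Assumed repository id following the details_<org>__<model> pattern -- not stated explicitly here.
data = load_dataset(
    "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",
)
```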
## Latest results
These are the latest results from run 2023-12-23T16:20:58.598253 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:20:58.598253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:20:58.598253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:20:58.598253(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
4b34c59affb881ca0fec81e858290f5f53ae5c69 |
# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MexIvanov/zephyr-python-ru](https://huggingface.co/MexIvanov/zephyr-python-ru) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MexIvanov__zephyr-python-ru",
"harness_winogrande_5",
split="train")
```
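The aggregated scores live in the separate "results" configuration mentioned above. As a sketch (assuming the standard `datasets` helpers and that the aggregated configuration exposes the same "latest" split as the per-task configurations), you could list the available configurations and pull the aggregated metrics like this:

```python
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_MexIvanov__zephyr-python-ru"

# The 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# The "results" configuration stores the aggregated metrics;
# the "latest" split is assumed to point at the most recent evaluation run.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```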
## Latest results
These are the [latest results from run 2023-12-23T16:26:04.991527](https://huggingface.co/datasets/open-llm-leaderboard/details_MexIvanov__zephyr-python-ru/blob/main/results_2023-12-23T16-26-04.991527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5991461225741262,
"acc_stderr": 0.03306015344516284,
"acc_norm": 0.6048288788808908,
"acc_norm_stderr": 0.033742531689769865,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5280108433994529,
"mc2_stderr": 0.015317682476455754
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294314,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.6224855606452898,
"acc_stderr": 0.004837744647345717,
"acc_norm": 0.8202549292969528,
"acc_norm_stderr": 0.0038319023702881065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332786,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.0253781399708852,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.0253781399708852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365886,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365886
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143698,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.17206703910614526,
"acc_stderr": 0.012623438533220628,
"acc_norm": 0.17206703910614526,
"acc_norm_stderr": 0.012623438533220628
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297236,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297236
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623343,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5280108433994529,
"mc2_stderr": 0.015317682476455754
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827943
},
"harness|gsm8k|5": {
"acc": 0.3252463987869598,
"acc_stderr": 0.01290390475254392
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_MexIvanov__zephyr-python-ru | [
"region:us"
] | 2023-12-23T16:28:21+00:00 | {"pretty_name": "Evaluation run of MexIvanov/zephyr-python-ru", "dataset_summary": "Dataset automatically created during the evaluation run of model [MexIvanov/zephyr-python-ru](https://huggingface.co/MexIvanov/zephyr-python-ru) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MexIvanov__zephyr-python-ru\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:26:04.991527](https://huggingface.co/datasets/open-llm-leaderboard/details_MexIvanov__zephyr-python-ru/blob/main/results_2023-12-23T16-26-04.991527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5991461225741262,\n \"acc_stderr\": 0.03306015344516284,\n \"acc_norm\": 0.6048288788808908,\n \"acc_norm_stderr\": 0.033742531689769865,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5280108433994529,\n \"mc2_stderr\": 0.015317682476455754\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294314,\n \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6224855606452898,\n \"acc_stderr\": 0.004837744647345717,\n \"acc_norm\": 0.8202549292969528,\n \"acc_norm_stderr\": 0.0038319023702881065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332786,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332786\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.0253781399708852,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.0253781399708852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365886,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365886\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n \"acc_stderr\": 0.014551310568143698,\n \"acc_norm\": 0.7905491698595147,\n \"acc_norm_stderr\": 
0.014551310568143698\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.17206703910614526,\n \"acc_stderr\": 0.012623438533220628,\n \"acc_norm\": 0.17206703910614526,\n \"acc_norm_stderr\": 0.012623438533220628\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297236,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297236\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.02965823509766691,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.02965823509766691\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623343,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623343\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5280108433994529,\n \"mc2_stderr\": 0.015317682476455754\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827943\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3252463987869598,\n \"acc_stderr\": 0.01290390475254392\n }\n}\n```", "repo_url": "https://huggingface.co/MexIvanov/zephyr-python-ru", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["**/details_harness|winogrande|5_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-26-04.991527.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_26_04.991527", "path": ["results_2023-12-23T16-26-04.991527.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T16-26-04.991527.parquet"]}]}]} | 2023-12-23T16:28:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru
Dataset automatically created during the evaluation run of model MexIvanov/zephyr-python-ru on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T16:26:04.991527 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru\n\n\n\nDataset automatically created during the evaluation run of model MexIvanov/zephyr-python-ru on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:04.991527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru\n\n\n\nDataset automatically created during the evaluation run of model MexIvanov/zephyr-python-ru on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:04.991527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru\n\n\n\nDataset automatically created during the evaluation run of model MexIvanov/zephyr-python-ru on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:04.991527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1946d71d8477ff4a44fa48e17175771d98bf11d6 |
# Dataset Card for Evaluation run of acrastt/kalomaze-stuff
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [acrastt/kalomaze-stuff](https://huggingface.co/acrastt/kalomaze-stuff) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__kalomaze-stuff",
"harness_winogrande_5",
split="train")
```
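Each evaluated task is also exposed as its own configuration. As a minimal sketch (assuming the config names and the "latest" split listed in this card's metadata, e.g. `harness_gsm8k_5`), you can load a single task's details or the aggregated metrics:
```python
from datasets import load_dataset

# Per-task details, here GSM8K; the config name and the "latest" split
# follow the naming visible in this card's metadata.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_acrastt__kalomaze-stuff",
    "harness_gsm8k_5",
    split="latest",
)

# Aggregated metrics for the run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_acrastt__kalomaze-stuff",
    "results",
    split="latest",
)
```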
## Latest results
These are the [latest results from run 2023-12-23T16:26:56.978626](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__kalomaze-stuff/blob/main/results_2023-12-23T16-26-56.978626.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.631028554660595,
"acc_stderr": 0.0324014053372231,
"acc_norm": 0.6369168843854331,
"acc_norm_stderr": 0.03305850958730039,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4163648865691074,
"mc2_stderr": 0.014225165369587414
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6324437363075085,
"acc_stderr": 0.004811543077792714,
"acc_norm": 0.8354909380601474,
"acc_norm_stderr": 0.0036997919347543672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612917,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612917
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159274,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159274
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628124,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628124
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594204,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594204
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.01585200244986211,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.01585200244986211
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358978,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358978
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050722,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4163648865691074,
"mc2_stderr": 0.014225165369587414
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.01152446695409025
},
"harness|gsm8k|5": {
"acc": 0.36315390447308565,
"acc_stderr": 0.013246614539839871
}
}
```
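If you prefer the raw JSON over the parquet configurations, the results file linked above can also be fetched directly. A minimal sketch with `huggingface_hub` (the filename comes from the results link above; the guard on the top-level key is an assumption in case the file nests its metrics):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated-results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_acrastt__kalomaze-stuff",
    filename="results_2023-12-23T16-26-56.978626.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# The excerpt above shows an "all" entry with the averaged accuracies; some dumps
# nest the per-task entries under a top-level "results" key, so fall back if needed.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])
```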
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
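As summarized at the top of this card, the repository exposes one configuration per evaluated task plus the aggregated `results` configuration, with one split per run timestamp and a "latest" split (as listed in this card's metadata). A minimal sketch to enumerate the configurations with the standard `datasets` API:
```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations and the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_acrastt__kalomaze-stuff")
print(len(configs))
print(configs[:5])
```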
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_acrastt__kalomaze-stuff | [
"region:us"
] | 2023-12-23T16:29:14+00:00 | {"pretty_name": "Evaluation run of acrastt/kalomaze-stuff", "dataset_summary": "Dataset automatically created during the evaluation run of model [acrastt/kalomaze-stuff](https://huggingface.co/acrastt/kalomaze-stuff) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__kalomaze-stuff\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:26:56.978626](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__kalomaze-stuff/blob/main/results_2023-12-23T16-26-56.978626.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.631028554660595,\n \"acc_stderr\": 0.0324014053372231,\n \"acc_norm\": 0.6369168843854331,\n \"acc_norm_stderr\": 0.03305850958730039,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4163648865691074,\n \"mc2_stderr\": 0.014225165369587414\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6324437363075085,\n \"acc_stderr\": 0.004811543077792714,\n \"acc_norm\": 0.8354909380601474,\n \"acc_norm_stderr\": 0.0036997919347543672\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612917,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612917\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159274,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159274\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159462,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159462\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628124,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628124\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n \"acc_stderr\": 0.014317653708594204,\n 
\"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594204\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.01585200244986211,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.01585200244986211\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358978,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358978\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050722,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050722\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4163648865691074,\n \"mc2_stderr\": 0.014225165369587414\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.01152446695409025\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36315390447308565,\n \"acc_stderr\": 0.013246614539839871\n }\n}\n```", "repo_url": 
"https://huggingface.co/acrastt/kalomaze-stuff", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-56.978626.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-56.978626.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-56.978626.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-56.978626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-56.978626.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-56.978626.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["**/details_harness|winogrande|5_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-26-56.978626.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_26_56.978626", "path": ["results_2023-12-23T16-26-56.978626.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T16-26-56.978626.parquet"]}]}]} | 2023-12-23T16:29:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of acrastt/kalomaze-stuff
Dataset automatically created during the evaluation run of model acrastt/kalomaze-stuff on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T16:26:56.978626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of acrastt/kalomaze-stuff\n\n\n\nDataset automatically created during the evaluation run of model acrastt/kalomaze-stuff on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:56.978626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of acrastt/kalomaze-stuff\n\n\n\nDataset automatically created during the evaluation run of model acrastt/kalomaze-stuff on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:56.978626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of acrastt/kalomaze-stuff\n\n\n\nDataset automatically created during the evaluation run of model acrastt/kalomaze-stuff on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:26:56.978626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
982b3778e7d7f130448c74e78ebecb28dd0e0872 |
# Dataset Card for Evaluation run of SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE](https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE",
"harness_winogrande_5",
split="train")
```
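To read the aggregated metrics directly, the "results" configuration mentioned above can be loaded in the same way. The snippet below is a minimal sketch: it assumes the "latest" split alias is available for this configuration (as it is for the per-task configurations), and the exact schema of the aggregated table may vary between harness versions.

```python
from datasets import load_dataset

# Load the aggregated metrics for this model (the "results" configuration).
# The "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE",
    "results",
    split="latest",
)

print(results[0])  # inspect the first row of aggregated scores
```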
## Latest results
These are the [latest results from run 2023-12-23T17:05:23.693649](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE/blob/main/results_2023-12-23T17-05-23.693649.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6520506430081576,
"acc_stderr": 0.031997647808783045,
"acc_norm": 0.653156295534757,
"acc_norm_stderr": 0.032642046886080175,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6126404146665745,
"mc2_stderr": 0.01563487272923927
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145683,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623492
},
"harness|hellaswag|10": {
"acc": 0.6833300139414459,
"acc_stderr": 0.0046422680794889395,
"acc_norm": 0.8603863772156941,
"acc_norm_stderr": 0.0034587739347195527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112025,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112025
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669957,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146294,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146294
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6126404146665745,
"mc2_stderr": 0.01563487272923927
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597221
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
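For quick inspection, the per-task entries above can be filtered straight from this dictionary. The sketch below is illustrative only: it assumes the JSON block above has been parsed into a Python dict named `results` (e.g. via `json.load`), with the same keys as shown.

```python
# Assumes `results` is a dict shaped like the JSON block above.
mmlu_scores = {
    task.split("|")[1]: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Average 5-shot accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_avg = sum(mmlu_scores.values()) / len(mmlu_scores)
print(f"MMLU average acc: {mmlu_avg:.4f}")
```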
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE | [
"region:us"
] | 2023-12-23T16:35:26+00:00 | {"pretty_name": "Evaluation run of SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE", "dataset_summary": "Dataset automatically created during the evaluation run of model [SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE](https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:05:23.693649](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Loyal-Toppy-Bruins-Maid-7B-DARE/blob/main/results_2023-12-23T17-05-23.693649.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520506430081576,\n \"acc_stderr\": 0.031997647808783045,\n \"acc_norm\": 0.653156295534757,\n \"acc_norm_stderr\": 0.032642046886080175,\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6126404146665745,\n \"mc2_stderr\": 0.01563487272923927\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145683,\n \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623492\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6833300139414459,\n \"acc_stderr\": 0.0046422680794889395,\n \"acc_norm\": 0.8603863772156941,\n \"acc_norm_stderr\": 0.0034587739347195527\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645365,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645365\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n \"acc_stderr\": 0.016669799592112025,\n \"acc_norm\": 0.46033519553072627,\n \"acc_norm_stderr\": 0.016669799592112025\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669957,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669957\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146294,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146294\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6126404146665745,\n \"mc2_stderr\": 0.01563487272923927\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597221\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146877\n 
}\n}\n```", "repo_url": "https://huggingface.co/SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-33-11.430841.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-33-11.430841.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-33-11.430841.parquet"]}, 
{"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["**/details_harness|winogrande|5_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": ["**/details_harness|winogrande|5_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-05-23.693649.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_33_11.430841", "path": ["results_2023-12-23T16-33-11.430841.parquet"]}, {"split": "2023_12_23T17_05_23.693649", "path": 
["results_2023-12-23T17-05-23.693649.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-05-23.693649.parquet"]}]}]} | 2023-12-23T17:08:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
Dataset automatically created during the evaluation run of model SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:05:23.693649 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the per-task results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
86453feeba5742125660b6c7c1b96bc21664b21d |
# Dataset Card for Evaluation run of BlueNipples/TimeCrystal-l2-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BlueNipples/TimeCrystal-l2-13B](https://huggingface.co/BlueNipples/TimeCrystal-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B",
"harness_winogrande_5",
split="train")
```
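You can also enumerate the available configurations programmatically, or pull the aggregated scores from the "results" configuration. A minimal sketch, assuming the `datasets` library is installed and using the repository name from the snippet above:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B"

# List every configuration in this repository (per-task configs plus "results").
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations, e.g. {configs[:3]}")

# The "results" configuration holds the aggregated scores; the "latest"
# split points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results.column_names)
```

Replacing `"latest"` with a timestamped split name (e.g. `"2023_12_23T16_37_50.678600"`) pins the query to a specific run.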
## Latest results
These are the [latest results from run 2023-12-23T16:37:50.678600](https://huggingface.co/datasets/open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B/blob/main/results_2023-12-23T16-37-50.678600.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the per-task results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5645325821600486,
"acc_stderr": 0.033674334876829026,
"acc_norm": 0.5699946755988184,
"acc_norm_stderr": 0.03438069375797606,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5129773945784643,
"mc2_stderr": 0.015576713007621413
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996074,
"acc_norm": 0.6117747440273038,
"acc_norm_stderr": 0.014241614207414044
},
"harness|hellaswag|10": {
"acc": 0.6447918741286597,
"acc_stderr": 0.00477598265035592,
"acc_norm": 0.8370842461661023,
"acc_norm_stderr": 0.003685340687255413
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.040260970832965634,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.040260970832965634
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929778,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.0241804971643769,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.0241804971643769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609828,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609828
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602667,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602667
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.02726429759980401,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.02726429759980401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.026959344518747784,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.026959344518747784
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940985,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940985
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835546,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835546
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5129773945784643,
"mc2_stderr": 0.015576713007621413
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437526
},
"harness|gsm8k|5": {
"acc": 0.27520849128127367,
"acc_stderr": 0.01230211430586265
}
}
```
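Because the per-task entries above follow a regular naming scheme (`harness|<task>|<n_shots>`), aggregates such as the mean MMLU score can be recomputed with a few lines of plain Python. A minimal sketch, assuming `results` is a plain dict with the same shape as the JSON block above:

```python
def mean_mmlu_acc_norm(results: dict) -> float:
    """Average acc_norm over the MMLU (hendrycksTest) sub-task entries."""
    scores = [
        metrics["acc_norm"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

# Usage sketch: parse a results_*.json payload shaped like the block above,
# e.g. results = json.loads(raw_text), then print(mean_mmlu_acc_norm(results)).
```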
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B | [
"region:us"
] | 2023-12-23T16:40:10+00:00 | {"pretty_name": "Evaluation run of BlueNipples/TimeCrystal-l2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BlueNipples/TimeCrystal-l2-13B](https://huggingface.co/BlueNipples/TimeCrystal-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:37:50.678600](https://huggingface.co/datasets/open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B/blob/main/results_2023-12-23T16-37-50.678600.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5645325821600486,\n \"acc_stderr\": 0.033674334876829026,\n \"acc_norm\": 0.5699946755988184,\n \"acc_norm_stderr\": 0.03438069375797606,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5129773945784643,\n \"mc2_stderr\": 0.015576713007621413\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996074,\n \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.014241614207414044\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6447918741286597,\n \"acc_stderr\": 0.00477598265035592,\n \"acc_norm\": 0.8370842461661023,\n \"acc_norm_stderr\": 0.003685340687255413\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n 
\"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929778,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.0241804971643769,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.0241804971643769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330877,\n \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330877\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.726605504587156,\n \"acc_stderr\": 0.01910929984609828,\n \"acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609828\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7611749680715197,\n \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602667,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602667\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.639871382636656,\n \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747784,\n \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747784\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940985,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940985\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835546,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835546\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5129773945784643,\n \"mc2_stderr\": 0.015576713007621413\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437526\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27520849128127367,\n \"acc_stderr\": 0.01230211430586265\n 
}\n}\n```", "repo_url": "https://huggingface.co/BlueNipples/TimeCrystal-l2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-37-50.678600.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-37-50.678600.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-37-50.678600.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-37-50.678600.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-37-50.678600.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_37_50.678600", "path": ["**/details_harness|winogrande|5_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-37-50.678600.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_37_50.678600", "path": ["results_2023-12-23T16-37-50.678600.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-37-50.678600.parquet"]}]}]} | 2023-12-23T16:40:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BlueNipples/TimeCrystal-l2-13B
Dataset automatically created during the evaluation run of model BlueNipples/TimeCrystal-l2-13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
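The code block for this record was not preserved in this rendering, so the following is a minimal sketch; the repository name is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention for this model:

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_BlueNipples__TimeCrystal-l2-13B",
	"harness_winogrande_5",
	split="train")
```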
## Latest results
These are the latest results from run 2023-12-23T16:37:50.678600 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BlueNipples/TimeCrystal-l2-13B\n\n\n\nDataset automatically created during the evaluation run of model BlueNipples/TimeCrystal-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:37:50.678600(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BlueNipples/TimeCrystal-l2-13B\n\n\n\nDataset automatically created during the evaluation run of model BlueNipples/TimeCrystal-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:37:50.678600(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BlueNipples/TimeCrystal-l2-13B\n\n\n\nDataset automatically created during the evaluation run of model BlueNipples/TimeCrystal-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:37:50.678600(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8ff1acbb9d04bb7f2a2bee9a0642200daf6c38b6 |
# Dataset Card for Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Dans-DiscountModels/Mistral-7b-FFT-Test3](https://huggingface.co/Dans-DiscountModels/Mistral-7b-FFT-Test3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-24T17:11:54.566247](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3/blob/main/results_2023-12-24T17-11-54.566247.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6179065360814351,
"acc_stderr": 0.032606828195602885,
"acc_norm": 0.6250778387796089,
"acc_norm_stderr": 0.033287207958457146,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4435511513265938,
"mc2_stderr": 0.014192712272717498
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642473,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.01430175222327954
},
"harness|hellaswag|10": {
"acc": 0.6237801234813782,
"acc_stderr": 0.004834461997944859,
"acc_norm": 0.8236407090221072,
"acc_norm_stderr": 0.003803466456054475
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.0253781399708852,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.0253781399708852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908237,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908237
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903219,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903219
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574516,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574516
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696637,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4435511513265938,
"mc2_stderr": 0.014192712272717498
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.2676269901440485,
"acc_stderr": 0.012194764427053344
}
}
```
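As a usage sketch, the aggregated numbers above can also be read back programmatically. This assumes the results JSON has already been downloaded locally (for example with `huggingface_hub.hf_hub_download`) and that its top level matches the snippet shown here; the local path below is illustrative:

```python
import json

# Illustrative path: assumes the results file shown above was downloaded locally first.
with open("results_2023-12-24T17-11-54.566247.json") as f:
    results = json.load(f)

# "all" holds the metrics aggregated over every task in the run.
print(results["all"]["acc"], results["all"]["acc_norm"])

# Per-task entries are keyed by the harness task name, e.g. 5-shot GSM8K.
print(results["harness|gsm8k|5"]["acc"])
```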
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3 | [
"region:us"
] | 2023-12-23T16:41:49+00:00 | {"pretty_name": "Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Dans-DiscountModels/Mistral-7b-FFT-Test3](https://huggingface.co/Dans-DiscountModels/Mistral-7b-FFT-Test3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T17:11:54.566247](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3/blob/main/results_2023-12-24T17-11-54.566247.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6179065360814351,\n \"acc_stderr\": 0.032606828195602885,\n \"acc_norm\": 0.6250778387796089,\n \"acc_norm_stderr\": 0.033287207958457146,\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4435511513265938,\n \"mc2_stderr\": 0.014192712272717498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642473,\n \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.01430175222327954\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6237801234813782,\n \"acc_stderr\": 0.004834461997944859,\n \"acc_norm\": 0.8236407090221072,\n \"acc_norm_stderr\": 0.003803466456054475\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.0253781399708852,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.0253781399708852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n 
\"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574516,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574516\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696637,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4435511513265938,\n \"mc2_stderr\": 0.014192712272717498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2676269901440485,\n \"acc_stderr\": 0.012194764427053344\n }\n}\n```", "repo_url": 
"https://huggingface.co/Dans-DiscountModels/Mistral-7b-FFT-Test3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-39-31.153080.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-39-31.153080.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-11-54.566247.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-11-54.566247.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-11-54.566247.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-11-54.566247.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-39-31.153080.parquet"]}, 
{"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["**/details_harness|winogrande|5_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": ["**/details_harness|winogrande|5_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T17-11-54.566247.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T16_39_31.153080", "path": ["results_2023-12-23T16-39-31.153080.parquet"]}, {"split": "2023_12_24T17_11_54.566247", "path": 
["results_2023-12-24T17-11-54.566247.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T17-11-54.566247.parquet"]}]}]} | 2023-12-24T17:14:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3
Dataset automatically created during the evaluation run of model Dans-DiscountModels/Mistral-7b-FFT-Test3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
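For example (a minimal sketch: the repository id below is assumed to follow the leaderboard's details_<org>__<model> naming pattern, and the config names are taken from this card's metadata, e.g. harness_winogrande_5 and results):

```python
from datasets import load_dataset

# Per-task details: one config per evaluated task; the "train" split points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3",  # assumed repo id
    "harness_winogrande_5",
    split="train")

# Aggregated metrics across tasks live in the "results" config; "latest" selects the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Dans-DiscountModels__Mistral-7b-FFT-Test3",  # assumed repo id
    "results",
    split="latest")
```

Any of the other per-task configs listed in this card's metadata (for instance harness_arc_challenge_25 or harness_gsm8k_5) can be loaded the same way.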
## Latest results
These are the latest results from run 2023-12-24T17:11:54.566247 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/Mistral-7b-FFT-Test3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:11:54.566247(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/Mistral-7b-FFT-Test3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:11:54.566247(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Dans-DiscountModels/Mistral-7b-FFT-Test3\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/Mistral-7b-FFT-Test3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T17:11:54.566247(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
f91121b898aca1fafcd4ac5d62f41a22a0a922de |
# Norsk Bokmål NLI dataset
Machine translation of MNLI and SNLI to Bokmål.
Based on tollefj/all-nli-NOB, but all neutral examples are removed, a test-train split is performed,
and entailment is mapped to 1, while contradiction is mapped to 0. This is done so that we can use AnglE Training on the dataset. | kardosdrur/nb-nli | [
"region:us"
] | 2023-12-23T16:56:03+00:00 | {"dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 74977588.8, "num_examples": 502724}, {"name": "test", "num_bytes": 18744397.2, "num_examples": 125681}], "download_size": 58272954, "dataset_size": 93721986.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-23T17:00:22+00:00 | [] | [] | TAGS
#region-us
|
# Norsk Bokmål NLI dataset
Machine translation of MNLI and SNLI to Bokmål.
Based on tollefj/all-nli-NOB, but all neutral examples are removed, test-train split is done,
and entailment is mapped to 1, while contradiction is mapped to 0. This is done so that we can use AnglE Training on the dataset. | [
"# Norsk Bokmål NLI dataset\n\nMachine translation of MNLI and SNLI to Bokmål.\nBased on tollefj/all-nli-NOB, but all neutral examples are removed, test-train split is done,\nand entailment is mapped to 1, while contradiction is mapped to 0. This is done so that we can use AnglE Training on the dataset."
] | [
"TAGS\n#region-us \n",
"# Norsk Bokmål NLI dataset\n\nMachine translation of MNLI and SNLI to Bokmål.\nBased on tollefj/all-nli-NOB, but all neutral examples are removed, test-train split is done,\nand entailment is mapped to 1, while contradiction is mapped to 0. This is done so that we can use AnglE Training on the dataset."
] | [
6,
89
] | [
"passage: TAGS\n#region-us \n# Norsk Bokmål NLI dataset\n\nMachine translation of MNLI and SNLI to Bokmål.\nBased on tollefj/all-nli-NOB, but all neutral examples are removed, test-train split is done,\nand entailment is mapped to 1, while contradiction is mapped to 0. This is done so that we can use AnglE Training on the dataset."
] |
f66ced7e7bb42c73b95bb8368296b65137d09554 |
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-23T16:55:01.684484](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct/blob/main/results_2023-12-23T16-55-01.684484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6653838410064873,
"acc_stderr": 0.031640270521971985,
"acc_norm": 0.6660954003934071,
"acc_norm_stderr": 0.03228645429155969,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7124078868751245,
"acc_stderr": 0.004517148434180491,
"acc_norm": 0.8829914359689305,
"acc_norm_stderr": 0.0032077357692780416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343338
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct | [
"region:us"
] | 2023-12-23T16:57:18+00:00 | {"pretty_name": "Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:55:01.684484](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct/blob/main/results_2023-12-23T16-55-01.684484.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6653838410064873,\n \"acc_stderr\": 0.031640270521971985,\n \"acc_norm\": 0.6660954003934071,\n \"acc_norm_stderr\": 0.03228645429155969,\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7124078868751245,\n \"acc_stderr\": 0.004517148434180491,\n \"acc_norm\": 0.8829914359689305,\n \"acc_norm_stderr\": 0.0032077357692780416\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.021995311963644244\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343338\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6467020470053071,\n \"acc_stderr\": 0.013166337192115683\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_55_01.684484", "path": ["**/details_harness|winogrande|5_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-55-01.684484.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_55_01.684484", "path": ["results_2023-12-23T16-55-01.684484.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-55-01.684484.parquet"]}]}]} | 2023-12-23T16:57:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
Dataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
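Below is a minimal sketch using the `datasets` library; the configuration name can be any of the 63 listed in this repository (e.g. `harness_winogrande_5`), and the aggregated run metrics live in the `results` configuration:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (5-shot Winogrande here)
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics for the whole run, always pointing at the newest evaluation
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct",
    "results",
    split="latest",
)
```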
## Latest results
These are the latest results from run 2023-12-23T16:55:01.684484 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:55:01.684484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:55:01.684484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:55:01.684484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
30c20095dfacec22bbae687d0b889c3ce53bccc4 |
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
"harness_winogrande_5",
split="train")
```
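The same call works for any of the 63 task configurations. The aggregated per-run metrics live in the extra "results" configuration described above; a short sketch of loading them (the config name is taken from this card's description, and "train" points to the latest results):

```python
from datasets import load_dataset

# Aggregated metrics for the run, via the "results" configuration
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
    "results",
    split="train",
)
print(results[0])  # one row of aggregated scores for the latest run
```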
## Latest results
These are the [latest results from run 2023-12-23T16:56:58.470467](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test/blob/main/results_2023-12-23T16-56-58.470467.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6653838410064873,
"acc_stderr": 0.031640270521971985,
"acc_norm": 0.6660954003934071,
"acc_norm_stderr": 0.03228645429155969,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7124078868751245,
"acc_stderr": 0.004517148434180491,
"acc_norm": 0.8829914359689305,
"acc_norm_stderr": 0.0032077357692780416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343338
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
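The raw JSON above can also be fetched and inspected programmatically. A sketch, assuming `huggingface_hub` is installed; the file name and repo id are the ones linked above, and the on-disk file may nest the snippet shown here under a `"results"` key:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked in the "Latest results" section
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
    filename="results_2023-12-23T16-56-58.470467.json",
    repo_type="dataset",
)
with open(path) as f:
    report = json.load(f)

# The file may wrap the per-task scores in a "results" key; fall back to the top level
scores = report.get("results", report)

# Rank the MMLU (hendrycksTest) subtasks by normalized accuracy
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```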
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test | [
"region:us"
] | 2023-12-23T16:59:16+00:00 | {"pretty_name": "Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:56:58.470467](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test/blob/main/results_2023-12-23T16-56-58.470467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6653838410064873,\n \"acc_stderr\": 0.031640270521971985,\n \"acc_norm\": 0.6660954003934071,\n \"acc_norm_stderr\": 0.03228645429155969,\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7124078868751245,\n \"acc_stderr\": 0.004517148434180491,\n \"acc_norm\": 0.8829914359689305,\n \"acc_norm_stderr\": 0.0032077357692780416\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.021995311963644244\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343338\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6467020470053071,\n \"acc_stderr\": 0.013166337192115683\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_56_58.470467", "path": ["**/details_harness|winogrande|5_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-56-58.470467.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_56_58.470467", "path": ["results_2023-12-23T16-56-58.470467.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-56-58.470467.parquet"]}]}]} | 2023-12-23T16:59:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test
Dataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
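For example (a minimal sketch — the repository id below is inferred from the `details_<org>__<model>` naming pattern used by the other detail datasets in this collection, so treat it as an assumption rather than a confirmed id):

```python
from datasets import load_dataset

# Repository id assumed from the usual "details_<org>__<model>" naming pattern.
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
    "harness_winogrande_5",
    split="train")
```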
## Latest results
These are the latest results from run 2023-12-23T16:56:58.470467 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:56:58.470467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:56:58.470467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:56:58.470467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
34165a7ecbcee07103b9ac911cdaa25736dc5bd0 |
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3",
"harness_winogrande_5",
split="train")
```
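The aggregated metrics live in the "results" configuration. Below is a sketch of loading them; the split names are taken from the configs listed in this card's metadata ("latest" always points at the most recent run, and each run is also addressable by its timestamped split name):

```python
from datasets import load_dataset

# Aggregated results for the most recent evaluation run.
results_latest = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3",
    "results",
    split="latest")

# The same run addressed explicitly by its timestamped split name.
results_run = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3",
    "results",
    split="2023_12_23T16_58_02.856769")
```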
## Latest results
These are the [latest results from run 2023-12-23T16:58:02.856769](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3/blob/main/results_2023-12-23T16-58-02.856769.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5621545466705268,
"acc_stderr": 0.03351539520737431,
"acc_norm": 0.5727655950528345,
"acc_norm_stderr": 0.03442395762278095,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|arc:challenge|25": {
"acc": 0.43856655290102387,
"acc_stderr": 0.014500682618212865,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.01457558392201966
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.004876028037941937,
"acc_norm": 0.8036247759410476,
"acc_norm_stderr": 0.003964437012249992
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098617,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310233,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310233
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.0184152863514164,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.0184152863514164
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613663,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613663
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.01578800719018588,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.01578800719018588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677886,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.01262078515588599,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.01262078515588599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835542,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835542
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871596
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
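As a rough illustration (assuming the JSON block above has been saved to a local file named `results.json` — that filename is only for the example), the per-subtask MMLU scores can be pulled out and averaged like this:

```python
import json

# Load the results dict shown above (the filename is an assumption for this sketch).
with open("results.json") as f:
    results = json.load(f)

# Collect acc_norm for every MMLU (hendrycksTest) subtask and average it.
mmlu_acc_norm = [v["acc_norm"] for k, v in results.items()
                 if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_acc_norm)} MMLU subtasks, mean acc_norm = "
      f"{sum(mmlu_acc_norm) / len(mmlu_acc_norm):.4f}")
```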
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3 | [
"region:us"
] | 2023-12-23T17:00:22+00:00 | {"pretty_name": "Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3", "dataset_summary": "Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:58:02.856769](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-13b-ep3/blob/main/results_2023-12-23T16-58-02.856769.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5621545466705268,\n \"acc_stderr\": 0.03351539520737431,\n \"acc_norm\": 0.5727655950528345,\n \"acc_norm_stderr\": 0.03442395762278095,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43856655290102387,\n \"acc_stderr\": 0.014500682618212865,\n \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.01457558392201966\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n \"acc_stderr\": 0.004876028037941937,\n \"acc_norm\": 0.8036247759410476,\n \"acc_norm_stderr\": 0.003964437012249992\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098617,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098617\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n \"acc_norm\": 0.8082901554404145,\n 
\"acc_norm_stderr\": 0.028408953626245265\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310233,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310233\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n \"acc_stderr\": 0.0184152863514164,\n \"acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.0184152863514164\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613663,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613663\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.01578800719018588,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.01578800719018588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677886,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677886\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n \"acc_stderr\": 0.01262078515588599,\n \"acc_norm\": 0.423728813559322,\n \"acc_norm_stderr\": 0.01262078515588599\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835542,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835542\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871596\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/luffycodes/vicuna-class-shishya-13b-ep3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-58-02.856769.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-58-02.856769.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-58-02.856769.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-58-02.856769.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-58-02.856769.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_58_02.856769", "path": ["**/details_harness|winogrande|5_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-58-02.856769.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_58_02.856769", "path": ["results_2023-12-23T16-58-02.856769.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-58-02.856769.parquet"]}]}]} | 2023-12-23T17:00:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3
Dataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-13b-ep3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T16:58:02.856769 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:58:02.856769(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:58:02.856769(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:58:02.856769(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
1f4a0321cb2c1fe7ae6b8fabe3c77beb143dff9d |
# Dataset Card for Evaluation run of Zangs3011/falcon_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Zangs3011/falcon_7b_DolphinCoder](https://huggingface.co/Zangs3011/falcon_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
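
If you are unsure which configuration names are available for this repository, the snippet below is one way to list them; it is a small sketch that assumes the standard `datasets` client and the repository id used above.

```python
from datasets import get_dataset_config_names

# List every per-task configuration exposed by this evaluation dataset
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder"
)
print(configs)  # e.g. ["harness_arc_challenge_25", ..., "results"]
```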
## Latest results
These are the [latest results from run 2023-12-23T16:59:18.964437](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder/blob/main/results_2023-12-23T16-59-18.964437.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28247468876632786,
"acc_stderr": 0.03156556817285131,
"acc_norm": 0.28312300073590296,
"acc_norm_stderr": 0.032303512019122835,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.35117580451709535,
"mc2_stderr": 0.013551047154306205
},
"harness|arc:challenge|25": {
"acc": 0.45307167235494883,
"acc_stderr": 0.014546892052005631,
"acc_norm": 0.4872013651877133,
"acc_norm_stderr": 0.014606603181012538
},
"harness|hellaswag|10": {
"acc": 0.5855407289384584,
"acc_stderr": 0.004916216503770337,
"acc_norm": 0.7803226448914559,
"acc_norm_stderr": 0.004131818797713878
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.0277242364927009,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.0277242364927009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.021502096078229147,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.021502096078229147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924317,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924317
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463345,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463345
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752937,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752937
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036733,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036733
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498835,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.01598281477469563,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.01598281477469563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.026004800363952113
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.02555765398186805,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.02555765398186805
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461004,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.02714627193662517,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.02714627193662517
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3283582089552239,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.3283582089552239,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.35117580451709535,
"mc2_stderr": 0.013551047154306205
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754765
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878086
}
}
```
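
To work with these aggregated numbers programmatically instead of reading the JSON above, one option is to load the "results" configuration; the snippet below is a minimal sketch that assumes the layout produced by the evaluation harness (one row per run, with the "latest" split pointing at the most recent run).

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder",
    "results",
    split="latest",
)

# Inspect the aggregated metrics of the latest run
print(results[0])
```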
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder | [
"region:us"
] | 2023-12-23T17:01:01+00:00 | {"pretty_name": "Evaluation run of Zangs3011/falcon_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/falcon_7b_DolphinCoder](https://huggingface.co/Zangs3011/falcon_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:59:18.964437](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder/blob/main/results_2023-12-23T16-59-18.964437.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28247468876632786,\n \"acc_stderr\": 0.03156556817285131,\n \"acc_norm\": 0.28312300073590296,\n \"acc_norm_stderr\": 0.032303512019122835,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.35117580451709535,\n \"mc2_stderr\": 0.013551047154306205\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.45307167235494883,\n \"acc_stderr\": 0.014546892052005631,\n \"acc_norm\": 0.4872013651877133,\n \"acc_norm_stderr\": 0.014606603181012538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5855407289384584,\n \"acc_stderr\": 0.004916216503770337,\n \"acc_norm\": 0.7803226448914559,\n \"acc_norm_stderr\": 0.004131818797713878\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.0277242364927009,\n \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.0277242364927009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378949,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378949\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.22486772486772486,\n \"acc_stderr\": 0.021502096078229147,\n \"acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.021502096078229147\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924317,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924317\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463345,\n \"acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463345\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752937,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752937\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036733,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036733\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29901960784313725,\n \"acc_stderr\": 0.032133257173736156,\n \"acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.032133257173736156\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.040933292298342784,\n \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.040933292298342784\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n \"acc_stderr\": 0.030679022765498835,\n \"acc_norm\": 0.3247863247863248,\n \"acc_norm_stderr\": 0.030679022765498835\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.27586206896551724,\n \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.02454761779480383,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.02454761779480383\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113,\n \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.026004800363952113\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.22508038585209003,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.02555765398186805,\n \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.02555765398186805\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461004,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461004\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.02714627193662517,\n \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.02714627193662517\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3283582089552239,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.3283582089552239,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.35117580451709535,\n \"mc2_stderr\": 0.013551047154306205\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754765\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05079605761940864,\n \"acc_stderr\": 
0.006048352096878086\n }\n}\n```", "repo_url": "https://huggingface.co/Zangs3011/falcon_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-18.964437.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-18.964437.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-18.964437.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-18.964437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-18.964437.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_59_18.964437", "path": ["**/details_harness|winogrande|5_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-59-18.964437.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_59_18.964437", "path": ["results_2023-12-23T16-59-18.964437.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-59-18.964437.parquet"]}]}]} | 2023-12-23T17:01:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Zangs3011/falcon_7b_DolphinCoder
Dataset automatically created during the evaluation run of model Zangs3011/falcon_7b_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
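A minimal loading sketch, mirroring the snippet embedded in this card's metadata (the configuration name "harness_winogrande_5" is only one of the available configurations and can be swapped for any other listed in this repository):

```python
from datasets import load_dataset

# Load the per-sample details of one evaluation task for this run.
data = load_dataset("open-llm-leaderboard/details_Zangs3011__falcon_7b_DolphinCoder",
	"harness_winogrande_5",
	split="train")
```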
## Latest results
These are the latest results from run 2023-12-23T16:59:18.964437 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Zangs3011/falcon_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/falcon_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:18.964437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Zangs3011/falcon_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/falcon_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:18.964437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Zangs3011/falcon_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/falcon_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:18.964437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
77cf1922facd54e1ae4bead9a5f0b9ed9a49733d | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
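While the card leaves direct use unspecified, a minimal sketch would be loading the dataset by its Hub id and inspecting what it contains (nothing about its splits or columns is asserted here; the snippet only assumes the id under which this card is published):

```python
from datasets import load_dataset

# Hypothetical usage sketch: load the dataset by its Hub id and inspect its structure.
ds = load_dataset("shubnandi/imdb_small")
print(ds)  # shows the available splits, column names, and row counts
```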
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | shubnandi/imdb_small | [
"region:us"
] | 2023-12-23T17:01:02+00:00 | {} | 2023-12-23T17:50:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
e615f79bcb04f7f66f4ec3f854650e009cf59925 |
# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1218
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218",
"harness_winogrande_5",
split="train")
```
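You can load the aggregated metrics the same way by pointing at the "results" configuration. This is a minimal sketch; the configuration name and the "latest" split are taken from this card's metadata, so adjust them if the repository layout changes.

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218",
	"results",
	split="latest")
```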
## Latest results
These are the [latest results from run 2023-12-23T16:59:03.056117](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218/blob/main/results_2023-12-23T16-59-03.056117.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6540752717223282,
"acc_stderr": 0.03195973524820356,
"acc_norm": 0.6539909026028121,
"acc_norm_stderr": 0.03262037928018462,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5947867444067919,
"mc2_stderr": 0.015138536405992413
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.01392100859517934,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946533
},
"harness|hellaswag|10": {
"acc": 0.6730730930093607,
"acc_stderr": 0.004681316064444416,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.0034358953866922546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028075,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5947867444067919,
"mc2_stderr": 0.015138536405992413
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.01233344758104755
}
}
```
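If you prefer to work with the raw JSON shown above rather than the parquet configurations, the file can be fetched directly from the repository. The snippet below is a sketch using `huggingface_hub`; the filename comes from the results link above, and the exact key layout of the downloaded file is an assumption, so the code probes for it rather than hard-coding a path.

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218",
    filename="results_2023-12-23T16-59-03.056117.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

print(list(data.keys()))
# The aggregate block shown above should be reachable under an "all" key,
# either at the top level or nested under "results", depending on the file layout.
agg = data.get("all") or data.get("results", {}).get("all")
print(agg)
```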
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218 | [
"region:us"
] | 2023-12-23T17:01:23+00:00 | {"pretty_name": "Evaluation run of OpenPipe/mistral-ft-optimized-1218", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T16:59:03.056117](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1218/blob/main/results_2023-12-23T16-59-03.056117.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6540752717223282,\n \"acc_stderr\": 0.03195973524820356,\n \"acc_norm\": 0.6539909026028121,\n \"acc_norm_stderr\": 0.03262037928018462,\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5947867444067919,\n \"mc2_stderr\": 0.015138536405992413\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.01392100859517934,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946533\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6730730930093607,\n \"acc_stderr\": 0.004681316064444416,\n \"acc_norm\": 0.8625771758613822,\n \"acc_norm_stderr\": 0.0034358953866922546\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028075,\n \"acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028075\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5947867444067919,\n \"mc2_stderr\": 0.015138536405992413\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.01233344758104755\n 
}\n}\n```", "repo_url": "https://huggingface.co/OpenPipe/mistral-ft-optimized-1218", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-03.056117.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-03.056117.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-03.056117.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-03.056117.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-03.056117.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T16_59_03.056117", "path": ["**/details_harness|winogrande|5_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T16-59-03.056117.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T16_59_03.056117", "path": ["results_2023-12-23T16-59-03.056117.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T16-59-03.056117.parquet"]}]}]} | 2023-12-23T17:01:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1218
Dataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1218 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T16:59:03.056117 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1218\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1218 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:03.056117(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1218\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1218 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:03.056117(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1218\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1218 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T16:59:03.056117(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ec0d15dd4d3ab4e1473d4c591de88b8e23d2d2be |
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-ac-hal-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3",
"harness_winogrande_5",
split="train")
```
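The aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch (assuming the "latest" split defined for that configuration in the repository metadata):

```python
from datasets import load_dataset

# Hypothetical illustration: load the aggregated results for this run.
# Repository, configuration, and split names follow the pattern described above.
results = load_dataset(
    "open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3",
    "results",
    split="latest",
)
print(results[0])  # one row per evaluation run, containing the aggregated metrics
```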
## Latest results
These are the [latest results from run 2023-12-23T17:04:02.519705](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3/blob/main/results_2023-12-23T17-04-02.519705.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5542525215859695,
"acc_stderr": 0.03349073743939158,
"acc_norm": 0.5645913429036128,
"acc_norm_stderr": 0.034401518822204755,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087283,
"mc2": 0.3932031444895361,
"mc2_stderr": 0.01464338174530395
},
"harness|arc:challenge|25": {
"acc": 0.44283276450511944,
"acc_stderr": 0.014515573873348897,
"acc_norm": 0.48464163822525597,
"acc_norm_stderr": 0.014604496129394908
},
"harness|hellaswag|10": {
"acc": 0.6133240390360486,
"acc_stderr": 0.004859930926500306,
"acc_norm": 0.8078072097191794,
"acc_norm_stderr": 0.003932184843841661
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957536,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876105,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070643,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070643
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690876,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690876
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568972,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568972
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.02636243757454654,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.02636243757454654
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.0158520024498621,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.0158520024498621
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884892,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884892
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886528,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014666,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014666
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087283,
"mc2": 0.3932031444895361,
"mc2_stderr": 0.01464338174530395
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
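As a minimal sketch (assuming the dictionary printed above has been parsed into a Python dict named `results`, e.g. with `json.loads`), the per-task entries can be aggregated, for instance to recompute an MMLU-style average over the `hendrycksTest` subtasks:

```python
# Minimal sketch: average the 5-shot accuracies of the "hendrycksTest" (MMLU) subtasks.
# `results` is assumed to be the dictionary shown in the JSON block above.
mmlu_accs = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```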
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3 | [
"region:us"
] | 2023-12-23T17:06:21+00:00 | {"pretty_name": "Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3", "dataset_summary": "Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-ac-hal-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:04:02.519705](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-13b-ep3/blob/main/results_2023-12-23T17-04-02.519705.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5542525215859695,\n \"acc_stderr\": 0.03349073743939158,\n \"acc_norm\": 0.5645913429036128,\n \"acc_norm_stderr\": 0.034401518822204755,\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087283,\n \"mc2\": 0.3932031444895361,\n \"mc2_stderr\": 0.01464338174530395\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44283276450511944,\n \"acc_stderr\": 0.014515573873348897,\n \"acc_norm\": 0.48464163822525597,\n \"acc_norm_stderr\": 0.014604496129394908\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6133240390360486,\n \"acc_stderr\": 0.004859930926500306,\n \"acc_norm\": 0.8078072097191794,\n \"acc_norm_stderr\": 0.003932184843841661\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.041321250197233685\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.027327548447957536,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.027327548447957536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876105,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070643,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070643\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n \"acc_norm\": 0.7927461139896373,\n 
\"acc_norm_stderr\": 0.02925282329180363\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690876,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690876\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n \"acc_stderr\": 0.015438083080568972,\n \"acc_norm\": 0.7522349936143039,\n \"acc_norm_stderr\": 0.015438083080568972\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.02636243757454654,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.02636243757454654\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.0158520024498621,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.0158520024498621\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n \"acc_stderr\": 0.012625879884892,\n \"acc_norm\": 0.42503259452411996,\n \"acc_norm_stderr\": 0.012625879884892\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886528,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886528\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014666,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014666\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087283,\n \"mc2\": 0.3932031444895361,\n \"mc2_stderr\": 0.01464338174530395\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-13b-ep3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-02.519705.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-02.519705.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-02.519705.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-02.519705.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-02.519705.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_04_02.519705", "path": ["**/details_harness|winogrande|5_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-04-02.519705.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_04_02.519705", "path": ["results_2023-12-23T17-04-02.519705.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-04-02.519705.parquet"]}]}]} | 2023-12-23T17:06:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3
Dataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-ac-hal-13b-ep3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:04:02.519705 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-ac-hal-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:02.519705(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-ac-hal-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:02.519705(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-shishya-ac-hal-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:02.519705(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
93c41a9767344c1513e7bc1feb198f58f98ee77e |
# Dataset Card for Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-tutor-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-tutor-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3",
"harness_winogrande_5",
split="train")
```
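If you want the aggregated metrics shown below rather than the per-sample details, the "results" configuration can be loaded in the same way. This is a minimal sketch: the config name "results" and the "latest" split come from this dataset's own configuration list, but the exact columns of the results parquet depend on the harness version, so treat the printed field names as an assumption.

```python
from datasets import load_dataset

# "latest" mirrors the timestamped split, since this dataset contains a single run.
results = load_dataset(
    "open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the run; column layout is an assumption
```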
## Latest results
These are the [latest results from run 2023-12-23T17:04:28.342677](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3/blob/main/results_2023-12-23T17-04-28.342677.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.565791735957054,
"acc_stderr": 0.03351962241327427,
"acc_norm": 0.5744059102733704,
"acc_norm_stderr": 0.03427708379273797,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5298607855616481,
"mc2_stderr": 0.015266264009722644
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.6179047998406691,
"acc_stderr": 0.004849065962692133,
"acc_norm": 0.8150766779525991,
"acc_norm_stderr": 0.0038744190656586222
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572274,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.027807032360686088,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.027807032360686088
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408831,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408831
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294635,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294635
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.01549108895149459,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.01549108895149459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.0161658475835633,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.0161658475835633
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430017,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430017
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457734,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457734
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387634,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387634
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5298607855616481,
"mc2_stderr": 0.015266264009722644
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309085
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3 | [
"region:us"
] | 2023-12-23T17:06:48+00:00 | {"pretty_name": "Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3", "dataset_summary": "Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-tutor-13b-ep3](https://huggingface.co/luffycodes/vicuna-class-tutor-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:04:28.342677](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-tutor-13b-ep3/blob/main/results_2023-12-23T17-04-28.342677.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.565791735957054,\n \"acc_stderr\": 0.03351962241327427,\n \"acc_norm\": 0.5744059102733704,\n \"acc_norm_stderr\": 0.03427708379273797,\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5298607855616481,\n \"mc2_stderr\": 0.015266264009722644\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6179047998406691,\n \"acc_stderr\": 0.004849065962692133,\n \"acc_norm\": 0.8150766779525991,\n \"acc_norm_stderr\": 0.0038744190656586222\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572274,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n \"acc_norm\": 0.8186528497409327,\n 
\"acc_norm_stderr\": 0.027807032360686088\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408831,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408831\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294635,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294635\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.7496807151979565,\n \"acc_stderr\": 0.01549108895149459,\n \"acc_norm\": 0.7496807151979565,\n \"acc_norm_stderr\": 0.01549108895149459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n \"acc_stderr\": 0.0161658475835633,\n \"acc_norm\": 0.37206703910614525,\n \"acc_norm_stderr\": 0.0161658475835633\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n \"acc_stderr\": 0.012623343757430017,\n \"acc_norm\": 0.424380704041721,\n \"acc_norm_stderr\": 0.012623343757430017\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457734,\n \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457734\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387634,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387634\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5298607855616481,\n \"mc2_stderr\": 0.015266264009722644\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \"acc_stderr\": 
0.008968608285309085\n }\n}\n```", "repo_url": "https://huggingface.co/luffycodes/vicuna-class-tutor-13b-ep3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-28.342677.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-28.342677.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-28.342677.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-28.342677.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-28.342677.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_04_28.342677", "path": ["**/details_harness|winogrande|5_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-04-28.342677.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_04_28.342677", "path": ["results_2023-12-23T17-04-28.342677.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-04-28.342677.parquet"]}]}]} | 2023-12-23T17:07:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3
Dataset automatically created during the evaluation run of model luffycodes/vicuna-class-tutor-13b-ep3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:04:28.342677 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-tutor-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:28.342677(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-tutor-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:28.342677(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of luffycodes/vicuna-class-tutor-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model luffycodes/vicuna-class-tutor-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:04:28.342677(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
dcbba3cac00cd989cde46b630f4b0849f5d160f2 |
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_glaive_code_v0.1](https://huggingface.co/mwitiderrick/open_llama_3b_glaive_code_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1",
"harness_winogrande_5",
split="train")
```
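
The aggregated metrics live in the separate "results" configuration mentioned above. As a minimal sketch (assuming this dataset exposes the same "results" configuration and "latest" split pattern as the per-task configurations listed in its metadata), they can be loaded the same way:

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
# The "results" config name and "latest" split are assumptions based on
# the config listing in this card's metadata, not a documented guarantee.
results = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1",
	"results",
	split="latest")
```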
## Latest results
These are the [latest results from run 2023-12-23T17:05:08.212858](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1/blob/main/results_2023-12-23T17-05-08.212858.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2843747535406573,
"acc_stderr": 0.031689110133124844,
"acc_norm": 0.28633888958645765,
"acc_norm_stderr": 0.03246963675970039,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3585983664640556,
"mc2_stderr": 0.013742745779138914
},
"harness|arc:challenge|25": {
"acc": 0.3703071672354949,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009131
},
"harness|hellaswag|10": {
"acc": 0.4971121290579566,
"acc_stderr": 0.004989698183207831,
"acc_norm": 0.6744672376020713,
"acc_norm_stderr": 0.004676159299105414
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337156,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337156
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113932,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113932
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124252,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124252
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.034531318018854146,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.034531318018854146
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.03304205087813653,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.03304205087813653
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360383,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360383
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.019109299846098275,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.019109299846098275
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.041733491480834974,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.041733491480834974
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02905858830374884,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02905858830374884
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770957,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.01094657096634878,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.01094657096634878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3585983664640556,
"mc2_stderr": 0.013742745779138914
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788961
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.003828982978735702
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1 | [
"region:us"
] | 2023-12-23T17:06:57+00:00 | {"pretty_name": "Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_glaive_code_v0.1](https://huggingface.co/mwitiderrick/open_llama_3b_glaive_code_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:05:08.212858](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_code_v0.1/blob/main/results_2023-12-23T17-05-08.212858.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2843747535406573,\n \"acc_stderr\": 0.031689110133124844,\n \"acc_norm\": 0.28633888958645765,\n \"acc_norm_stderr\": 0.03246963675970039,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3585983664640556,\n \"mc2_stderr\": 0.013742745779138914\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009131\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4971121290579566,\n \"acc_stderr\": 0.004989698183207831,\n \"acc_norm\": 0.6744672376020713,\n \"acc_norm_stderr\": 0.004676159299105414\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337156,\n \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337156\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113932,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113932\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.03619604524124252,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.03619604524124252\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.034531318018854146,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.034531318018854146\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.03304205087813653,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.03304205087813653\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n \"acc_norm\": 
0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27339449541284405,\n \"acc_stderr\": 0.019109299846098275,\n \"acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.019109299846098275\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2975206611570248,\n \"acc_stderr\": 0.041733491480834974,\n \"acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.041733491480834974\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.02905858830374884,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.02905858830374884\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 
0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n \"acc_stderr\": 0.016225017944770957,\n \"acc_norm\": 0.28991060025542786,\n \"acc_norm_stderr\": 0.016225017944770957\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.3183279742765273,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n \"acc_stderr\": 0.01094657096634878,\n \"acc_norm\": 0.242503259452412,\n \"acc_norm_stderr\": 0.01094657096634878\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596455,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596455\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427657,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3585983664640556,\n \"mc2_stderr\": 0.013742745779138914\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788961\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.003828982978735702\n }\n}\n```", "repo_url": "https://huggingface.co/mwitiderrick/open_llama_3b_glaive_code_v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-08.212858.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-08.212858.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-08.212858.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-08.212858.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-08.212858.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["**/details_harness|winogrande|5_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-23T17-05-08.212858.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_05_08.212858", "path": ["results_2023-12-23T17-05-08.212858.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-05-08.212858.parquet"]}]}]} | 2023-12-23T17:07:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1
Dataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_code_v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:05:08.212858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_code_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:05:08.212858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_code_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:05:08.212858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_code_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_code_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:05:08.212858(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
448be9efa95638c32b35c83c8f64b927537e7c8e |
# Dataset Card for Evaluation run of yhyu13/LMCocktail-10.7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yhyu13/LMCocktail-10.7B-v1](https://huggingface.co/yhyu13/LMCocktail-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1",
"harness_winogrande_5",
split="train")
```
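
The aggregated scores live in the "results" configuration mentioned above. Below is a minimal sketch of loading it; the exact column layout of the aggregated parquet file is not documented here, so the printed structure is an assumption and may vary between runs:

```python
from datasets import load_dataset

# Load the aggregated results of the latest run for this model.
# "results" is the additional configuration that stores the per-task aggregates,
# and the "latest" split always points to the most recent evaluation snapshot.
results = load_dataset("open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1",
	"results",
	split="latest")

# Each row corresponds to one evaluation snapshot; inspecting it shows the
# aggregated metrics (acc, acc_norm, mc1, mc2, ...) used by the leaderboard.
print(results[0])
```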
## Latest results
These are the [latest results from run 2023-12-23T17:18:52.546076](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1/blob/main/results_2023-12-23T17-18-52.546076.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6656979362421428,
"acc_stderr": 0.031660298381466584,
"acc_norm": 0.6665217090107124,
"acc_norm_stderr": 0.032305792594458954,
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7102777882533455,
"mc2_stderr": 0.015039392112656383
},
"harness|arc:challenge|25": {
"acc": 0.681740614334471,
"acc_stderr": 0.013611993916971453,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.013307250444941108
},
"harness|hellaswag|10": {
"acc": 0.7056363274248157,
"acc_stderr": 0.004548247487546323,
"acc_norm": 0.8812985461063533,
"acc_norm_stderr": 0.0032277587155456044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361072,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361072
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.02570765861415496,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.02570765861415496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.01622353351036512,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.01622353351036512
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7932098765432098,
"acc_stderr": 0.02253500670594284,
"acc_norm": 0.7932098765432098,
"acc_norm_stderr": 0.02253500670594284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117519,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.02691748122437721,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.02691748122437721
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595294,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595294
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7102777882533455,
"mc2_stderr": 0.015039392112656383
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781091
},
"harness|gsm8k|5": {
"acc": 0.6497346474601972,
"acc_stderr": 0.013140409455571284
}
}
```
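If you only need these aggregated numbers rather than the per-sample details, one option is to fetch the results JSON file directly from the dataset repository. This is a minimal sketch, assuming the file name matches the link in the "Latest results" section above and that the per-task scores sit either at the top level of the file or under a `results` key:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1",
    filename="results_2023-12-23T17-18-52.546076.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The file may wrap the scores in a "results" key; fall back to the top level.
scores = data.get("results", data)
print(scores["all"]["acc"], scores["all"]["acc_norm"])
```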
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Yhyu13__LMCocktail-10.7B-v1 | [
"region:us"
] | 2023-12-23T17:07:44+00:00 | {"pretty_name": "Evaluation run of yhyu13/LMCocktail-10.7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [yhyu13/LMCocktail-10.7B-v1](https://huggingface.co/yhyu13/LMCocktail-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:18:52.546076](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyu13__LMCocktail-10.7B-v1/blob/main/results_2023-12-23T17-18-52.546076.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6656979362421428,\n \"acc_stderr\": 0.031660298381466584,\n \"acc_norm\": 0.6665217090107124,\n \"acc_norm_stderr\": 0.032305792594458954,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7102777882533455,\n \"mc2_stderr\": 0.015039392112656383\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971453,\n \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.013307250444941108\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7056363274248157,\n \"acc_stderr\": 0.004548247487546323,\n \"acc_norm\": 0.8812985461063533,\n \"acc_norm_stderr\": 0.0032277587155456044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.02570765861415496,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.02570765861415496\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.02403548967633506,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633506\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 
0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n \"acc_stderr\": 0.01622353351036512,\n \"acc_norm\": 0.3787709497206704,\n \"acc_norm_stderr\": 0.01622353351036512\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7932098765432098,\n \"acc_stderr\": 0.02253500670594284,\n \"acc_norm\": 0.7932098765432098,\n \"acc_norm_stderr\": 0.02253500670594284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n \"acc_stderr\": 0.012769704263117519,\n \"acc_norm\": 0.4954367666232073,\n \"acc_norm_stderr\": 0.012769704263117519\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.02691748122437721,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.02691748122437721\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595294,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595294\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7102777882533455,\n \"mc2_stderr\": 0.015039392112656383\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \"acc_stderr\": 0.013140409455571284\n }\n}\n```", "repo_url": "https://huggingface.co/yhyu13/LMCocktail-10.7B-v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-29.674400.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-29.674400.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-52.546076.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-52.546076.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-52.546076.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-52.546076.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-05-29.674400.parquet"]}, 
{"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["**/details_harness|winogrande|5_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-52.546076.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_05_29.674400", "path": ["results_2023-12-23T17-05-29.674400.parquet"]}, {"split": "2023_12_23T17_18_52.546076", "path": 
["results_2023-12-23T17-18-52.546076.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-18-52.546076.parquet"]}]}]} | 2023-12-23T17:21:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yhyu13/LMCocktail-10.7B-v1
Dataset automatically created during the evaluation run of model yhyu13/LMCocktail-10.7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:18:52.546076 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yhyu13/LMCocktail-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model yhyu13/LMCocktail-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:52.546076(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yhyu13/LMCocktail-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model yhyu13/LMCocktail-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:52.546076(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yhyu13/LMCocktail-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model yhyu13/LMCocktail-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:52.546076(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
ddafa7b58824fb4a41c33cdb0c8bcff0843f4471 |
# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [joey00072/ToxicHermes-2.5-Mistral-7B](https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
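
The aggregated metrics can be loaded the same way by pointing at the "results" configuration listed in this card; the snippet below is a minimal sketch (the repository, config, and split names are taken from this card, the rest is illustrative):

```python
from datasets import load_dataset

# Aggregated results for this model; the "latest" split points to the most recent run
results = load_dataset("open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B",
	"results",
	split="latest")

# Inspect which columns are stored before digging into individual metrics
print(results.column_names)
```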
## Latest results
These are the [latest results from run 2023-12-23T17:12:50.867091](https://huggingface.co/datasets/open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B/blob/main/results_2023-12-23T17-12-50.867091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6310556791538692,
"acc_stderr": 0.032203868447530745,
"acc_norm": 0.6402886061157618,
"acc_norm_stderr": 0.032897055185662744,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5083945294452454,
"mc2_stderr": 0.015230833666821306
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.014296513020180646,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756562
},
"harness|hellaswag|10": {
"acc": 0.6448914558852819,
"acc_stderr": 0.004775681871529863,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.003681708282581456
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998884,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5083945294452454,
"mc2_stderr": 0.015230833666821306
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643414
},
"harness|gsm8k|5": {
"acc": 0.17361637604245642,
"acc_stderr": 0.01043346322125761
}
}
```
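
If you only need a single headline number from the dictionary above, it can be reduced with a few lines of Python; this is a rough sketch (the fallback order of the metric keys is an assumption, not the official leaderboard aggregation):

```python
# results_dict is assumed to be the JSON object shown above, already parsed with json.loads
def headline_average(results_dict: dict) -> float:
    scores = []
    for task, metrics in results_dict.items():
        if task == "all":
            continue  # "all" is itself an aggregate, skip it
        # Prefer normalized accuracy, then accuracy, then mc2 (TruthfulQA reports no acc field)
        score = metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
        if score is not None:
            scores.append(score)
    return sum(scores) / len(scores)
```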
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B | [
"region:us"
] | 2023-12-23T17:15:07+00:00 | {"pretty_name": "Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [joey00072/ToxicHermes-2.5-Mistral-7B](https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:12:50.867091](https://huggingface.co/datasets/open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B/blob/main/results_2023-12-23T17-12-50.867091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6310556791538692,\n \"acc_stderr\": 0.032203868447530745,\n \"acc_norm\": 0.6402886061157618,\n \"acc_norm_stderr\": 0.032897055185662744,\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5083945294452454,\n \"mc2_stderr\": 0.015230833666821306\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.014296513020180646,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756562\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6448914558852819,\n \"acc_stderr\": 0.004775681871529863,\n \"acc_norm\": 0.8374825731925911,\n \"acc_norm_stderr\": 0.003681708282581456\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998884,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5083945294452454,\n \"mc2_stderr\": 0.015230833666821306\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643414\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17361637604245642,\n \"acc_stderr\": 0.01043346322125761\n }\n}\n```", 
"repo_url": "https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_12_50.867091", "path": ["**/details_harness|winogrande|5_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-12-50.867091.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_12_50.867091", "path": ["results_2023-12-23T17-12-50.867091.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-12-50.867091.parquet"]}]}]} | 2023-12-23T17:15:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model joey00072/ToxicHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:12:50.867091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model joey00072/ToxicHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:12:50.867091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model joey00072/ToxicHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:12:50.867091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model joey00072/ToxicHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:12:50.867091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
77b0d830f5e94d1b76dd07764170db892a169cb6 |
# Dataset Card for Evaluation run of vikash06/llama-2-7b-small-model-new
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/llama-2-7b-small-model-new](https://huggingface.co/vikash06/llama-2-7b-small-model-new) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new",
"harness_winogrande_5",
split="train")
```
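
The aggregated scores described above can be loaded in the same way. The sketch below assumes the "results" configuration and its "latest" split behave like the per-task configurations; the exact column layout of the rows is an assumption:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration for this evaluation run.
# The config name "results" and the "latest" split follow the description above;
# what fields each row exposes is an assumption.
results = load_dataset(
    "open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new",
    "results",
    split="latest",
)

# Each row stores one full evaluation run; inspect the available fields.
print(results[0].keys())
```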
## Latest results
These are the [latest results from run 2023-12-23T17:13:47.425538](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new/blob/main/results_2023-12-23T17-13-47.425538.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45995921713524485,
"acc_stderr": 0.034705933213305985,
"acc_norm": 0.46657860357627995,
"acc_norm_stderr": 0.035509017738943494,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.42463864717462163,
"mc2_stderr": 0.016089627327060634
},
"harness|arc:challenge|25": {
"acc": 0.4308873720136519,
"acc_stderr": 0.014471133392642459,
"acc_norm": 0.4522184300341297,
"acc_norm_stderr": 0.014544519880633823
},
"harness|hellaswag|10": {
"acc": 0.5393347938657638,
"acc_stderr": 0.004974316807920403,
"acc_norm": 0.7234614618601872,
"acc_norm_stderr": 0.004463721071319091
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.02515826601686856,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.02515826601686856
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6146788990825688,
"acc_stderr": 0.020865850852794122,
"acc_norm": 0.6146788990825688,
"acc_norm_stderr": 0.020865850852794122
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6075949367088608,
"acc_stderr": 0.03178471874564729,
"acc_norm": 0.6075949367088608,
"acc_norm_stderr": 0.03178471874564729
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.0311669573672359,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.0311669573672359
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6155810983397191,
"acc_stderr": 0.01739568874281962,
"acc_norm": 0.6155810983397191,
"acc_norm_stderr": 0.01739568874281962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580956,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49382716049382713,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.49382716049382713,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650147,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3604954367666232,
"acc_stderr": 0.01226311023729923,
"acc_norm": 0.3604954367666232,
"acc_norm_stderr": 0.01226311023729923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.42463864717462163,
"mc2_stderr": 0.016089627327060634
},
"harness|winogrande|5": {
"acc": 0.6393054459352802,
"acc_stderr": 0.013496064394234033
},
"harness|gsm8k|5": {
"acc": 0.09552691432903715,
"acc_stderr": 0.008096605771155731
}
}
```
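
As a rough sketch of how these per-task numbers might be post-processed, the snippet below ranks tasks by score. The `raw` dictionary is assumed to mirror the JSON excerpt above (only a few entries are reproduced here); loading it from the results file is also an assumption:

```python
import json

# Assumed: `raw` holds the task entries from the JSON above, e.g.
# raw = json.load(open("results_2023-12-23T17-13-47.425538.json"))
raw = {
    "harness|arc:challenge|25": {"acc_norm": 0.4522184300341297},
    "harness|hellaswag|10": {"acc_norm": 0.7234614618601872},
    "harness|winogrande|5": {"acc": 0.6393054459352802},
    "harness|gsm8k|5": {"acc": 0.09552691432903715},
}

# Prefer normalized accuracy when present, fall back to plain accuracy.
scores = {
    task: metrics.get("acc_norm", metrics.get("acc"))
    for task, metrics in raw.items()
    if "acc" in metrics or "acc_norm" in metrics
}

# Print tasks from strongest to weakest.
for task, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task:<30} {score:.4f}")
```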
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new | [
"region:us"
] | 2023-12-23T17:16:06+00:00 | {"pretty_name": "Evaluation run of vikash06/llama-2-7b-small-model-new", "dataset_summary": "Dataset automatically created during the evaluation run of model [vikash06/llama-2-7b-small-model-new](https://huggingface.co/vikash06/llama-2-7b-small-model-new) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:13:47.425538](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new/blob/main/results_2023-12-23T17-13-47.425538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45995921713524485,\n \"acc_stderr\": 0.034705933213305985,\n \"acc_norm\": 0.46657860357627995,\n \"acc_norm_stderr\": 0.035509017738943494,\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.42463864717462163,\n \"mc2_stderr\": 0.016089627327060634\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4308873720136519,\n \"acc_stderr\": 0.014471133392642459,\n \"acc_norm\": 0.4522184300341297,\n \"acc_norm_stderr\": 0.014544519880633823\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5393347938657638,\n \"acc_stderr\": 0.004974316807920403,\n \"acc_norm\": 0.7234614618601872,\n \"acc_norm_stderr\": 0.004463721071319091\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.030767394707808093,\n \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.030767394707808093\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n \"acc_norm\": 0.6787564766839378,\n 
\"acc_norm_stderr\": 0.033699508685490674\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.02515826601686856,\n \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.02515826601686856\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6146788990825688,\n \"acc_stderr\": 0.020865850852794122,\n \"acc_norm\": 0.6146788990825688,\n \"acc_norm_stderr\": 0.020865850852794122\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367992,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367992\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6075949367088608,\n \"acc_stderr\": 0.03178471874564729,\n \"acc_norm\": 0.6075949367088608,\n \"acc_norm_stderr\": 0.03178471874564729\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.0311669573672359,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.0311669573672359\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.6155810983397191,\n \"acc_stderr\": 0.01739568874281962,\n \"acc_norm\": 0.6155810983397191,\n \"acc_norm_stderr\": 0.01739568874281962\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149116,\n \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149116\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.028624412550167958,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.028624412550167958\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n \"acc_stderr\": 0.02809924077580956,\n \"acc_norm\": 0.572347266881029,\n \"acc_norm_stderr\": 0.02809924077580956\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.49382716049382713,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.49382716049382713,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650147,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650147\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3604954367666232,\n \"acc_stderr\": 0.01226311023729923,\n \"acc_norm\": 0.3604954367666232,\n \"acc_norm_stderr\": 0.01226311023729923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687358,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.42463864717462163,\n \"mc2_stderr\": 0.016089627327060634\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6393054459352802,\n \"acc_stderr\": 0.013496064394234033\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09552691432903715,\n \"acc_stderr\": 
0.008096605771155731\n }\n}\n```", "repo_url": "https://huggingface.co/vikash06/llama-2-7b-small-model-new", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-13-47.425538.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-13-47.425538.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-13-47.425538.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-13-47.425538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-13-47.425538.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_13_47.425538", "path": ["**/details_harness|winogrande|5_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-13-47.425538.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_13_47.425538", "path": ["results_2023-12-23T17-13-47.425538.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-13-47.425538.parquet"]}]}]} | 2023-12-23T17:16:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vikash06/llama-2-7b-small-model-new
Dataset automatically created during the evaluation run of model vikash06/llama-2-7b-small-model-new on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
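The example below mirrors the loading snippet recorded in this card's metadata:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_vikash06__llama-2-7b-small-model-new",
	"harness_winogrande_5",
	split="train")
```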
## Latest results
These are the latest results from run 2023-12-23T17:13:47.425538 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vikash06/llama-2-7b-small-model-new\n\n\n\nDataset automatically created during the evaluation run of model vikash06/llama-2-7b-small-model-new on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:13:47.425538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vikash06/llama-2-7b-small-model-new\n\n\n\nDataset automatically created during the evaluation run of model vikash06/llama-2-7b-small-model-new on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:13:47.425538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vikash06/llama-2-7b-small-model-new\n\n\n\nDataset automatically created during the evaluation run of model vikash06/llama-2-7b-small-model-new on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:13:47.425538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
6b4894e5fc6c6b3f4fd71d919ccd7eeba4b221d0 |
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
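
As a minimal follow-up sketch (the repo name follows the pattern above; the aggregated metrics live in the "results" configuration mentioned earlier, with a "latest" split for the most recent run and a timestamped split per run):

```python
from datasets import load_dataset

# Aggregated per-task and overall scores for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder",
	"results",
	split="latest")
print(results[0])
```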
## Latest results
These are the [latest results from run 2023-12-23T17:18:24.338382](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder/blob/main/results_2023-12-23T17-18-24.338382.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5955742366975546,
"acc_stderr": 0.032892026757812796,
"acc_norm": 0.6023520874451797,
"acc_norm_stderr": 0.03357558761791825,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196204,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.628460466042621,
"acc_stderr": 0.004822286556305222,
"acc_norm": 0.8163712407886875,
"acc_norm_stderr": 0.003863898546941602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.033809398139433545,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.033809398139433545
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709583,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354015,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354015
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708267
},
"harness|gsm8k|5": {
"acc": 0.2623199393479909,
"acc_stderr": 0.012116912419925704
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder | [
"region:us"
] | 2023-12-23T17:20:42+00:00 | {"pretty_name": "Evaluation run of Zangs3011/mistral_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:18:24.338382](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder/blob/main/results_2023-12-23T17-18-24.338382.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5955742366975546,\n \"acc_stderr\": 0.032892026757812796,\n \"acc_norm\": 0.6023520874451797,\n \"acc_norm_stderr\": 0.03357558761791825,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196204,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.628460466042621,\n \"acc_stderr\": 0.004822286556305222,\n \"acc_norm\": 0.8163712407886875,\n \"acc_norm_stderr\": 0.003863898546941602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.033809398139433545,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.033809398139433545\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.014866821664709583,\n 
\"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.014866821664709583\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666787,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354015,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354015\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2623199393479909,\n \"acc_stderr\": 0.012116912419925704\n }\n}\n```", "repo_url": 
"https://huggingface.co/Zangs3011/mistral_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-24.338382.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-24.338382.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-24.338382.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-24.338382.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-24.338382.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-24.338382.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-24.338382.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_18_24.338382", "path": ["results_2023-12-23T17-18-24.338382.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T17-18-24.338382.parquet"]}]}]} | 2023-12-23T17:21:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_DolphinCoder
Dataset automatically created during the evaluation run of model Zangs3011/mistral_7b_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
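A minimal sketch, assuming the `datasets` library is installed; the repository id and the "harness_winogrande_5" configuration name are taken from the card metadata above, and any other per-task configuration listed there can be substituted:

```python
from datasets import load_dataset

# Load one per-task configuration of this evaluation run.
# "harness_winogrande_5" is one of the configs declared in the card metadata;
# the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder",
    "harness_winogrande_5",
    split="train",
)
```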
## Latest results
These are the latest results from run 2023-12-23T17:18:24.338382 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
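The aggregated numbers for this run live in the "results" configuration; as a sketch, its "latest" split (which, per the metadata, points at results_2023-12-23T17-18-24.338382.parquet) can be loaded the same way:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# its "latest" split points to results_2023-12-23T17-18-24.338382.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mistral_7b_DolphinCoder",
    "results",
    split="latest",
)
```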
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Zangs3011/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:24.338382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Zangs3011/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:24.338382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Zangs3011/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:24.338382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
f940716cb65cab11bdf22cecee7eb419b256a390 |
# Dataset Card for Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/PiVoT-SOLAR-10.7B-RP](https://huggingface.co/maywell/PiVoT-SOLAR-10.7B-RP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP",
"harness_winogrande_5",
split="train")
```
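As a further, illustrative sketch, the aggregated "results" configuration described above can be loaded the same way; the exact columns it contains are whatever the evaluation harness stored for this run, so the snippet below only inspects the schema rather than assuming specific fields:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always points
# to the most recent evaluation run (timestamped splits also exist per run).
results = load_dataset(
    "open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP",
    "results",
    split="latest",
)

# Inspect the stored columns without assuming a particular schema.
print(results)
print(results.column_names)
```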
## Latest results
These are the [latest results from run 2023-12-23T17:18:41.486751](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP/blob/main/results_2023-12-23T17-18-41.486751.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6422920263377255,
"acc_stderr": 0.032175863348431345,
"acc_norm": 0.6457131729039554,
"acc_norm_stderr": 0.03281189062466209,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5654007756662209,
"mc2_stderr": 0.015376981315076
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.6510238907849829,
"acc_norm_stderr": 0.0139289334613825
},
"harness|hellaswag|10": {
"acc": 0.6281617207727545,
"acc_stderr": 0.004823078145064966,
"acc_norm": 0.818263294164509,
"acc_norm_stderr": 0.0038483926569392434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302064,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302064
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.0255250343824749,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.0255250343824749
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461777,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461777
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931806,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931806
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5013037809647979,
"acc_stderr": 0.012770192691057112,
"acc_norm": 0.5013037809647979,
"acc_norm_stderr": 0.012770192691057112
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355442,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355442
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5654007756662209,
"mc2_stderr": 0.015376981315076
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836682
},
"harness|gsm8k|5": {
"acc": 0.5382865807429871,
"acc_stderr": 0.013732048227016683
}
}
```
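As an illustrative sketch, the raw results file linked above can also be downloaded and summarised directly; whether the per-task metrics are nested under a "results" key is an assumption about the harness output, so the snippet falls back to the top-level dictionary shown above:

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP",
    filename="results_2023-12-23T17-18-41.486751.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Per-task metrics may sit under a "results" key (assumption); otherwise use the
# top-level dictionary, which matches the block printed above.
per_task = data.get("results", data)

# Mean 5-shot accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [m["acc"] for task, m in per_task.items() if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```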
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP | [
"region:us"
] | 2023-12-23T17:20:57+00:00 | {"pretty_name": "Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/PiVoT-SOLAR-10.7B-RP](https://huggingface.co/maywell/PiVoT-SOLAR-10.7B-RP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:18:41.486751](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP/blob/main/results_2023-12-23T17-18-41.486751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6422920263377255,\n \"acc_stderr\": 0.032175863348431345,\n \"acc_norm\": 0.6457131729039554,\n \"acc_norm_stderr\": 0.03281189062466209,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5654007756662209,\n \"mc2_stderr\": 0.015376981315076\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.0139289334613825\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6281617207727545,\n \"acc_stderr\": 0.004823078145064966,\n \"acc_norm\": 0.818263294164509,\n \"acc_norm_stderr\": 0.0038483926569392434\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302064,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302064\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.0255250343824749,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.0255250343824749\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461777,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461777\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931806,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931806\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5013037809647979,\n \"acc_stderr\": 0.012770192691057112,\n \"acc_norm\": 0.5013037809647979,\n \"acc_norm_stderr\": 0.012770192691057112\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355442,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355442\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5654007756662209,\n \"mc2_stderr\": 0.015376981315076\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5382865807429871,\n \"acc_stderr\": 
0.013732048227016683\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/PiVoT-SOLAR-10.7B-RP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-41.486751.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-41.486751.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-41.486751.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-41.486751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-41.486751.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_18_41.486751", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-18-41.486751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_18_41.486751", "path": ["results_2023-12-23T17-18-41.486751.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-18-41.486751.parquet"]}]}]} | 2023-12-23T17:21:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP
Dataset automatically created during the evaluation run of model maywell/PiVoT-SOLAR-10.7B-RP on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:18:41.486751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-SOLAR-10.7B-RP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:41.486751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-SOLAR-10.7B-RP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:41.486751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/PiVoT-SOLAR-10.7B-RP\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-SOLAR-10.7B-RP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:18:41.486751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
893e66ea2bd6a15c9b19ac7d69dbc95eb6d43353 |
# Dataset Card for Evaluation run of cookinai/DonutLM-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cookinai/DonutLM-v1](https://huggingface.co/cookinai/DonutLM-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cookinai__DonutLM-v1",
"harness_winogrande_5",
split="train")
```
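
The aggregated metrics for each run live in the `results` configuration mentioned above. A minimal sketch for loading them (assuming the `results` config name and `latest` split listed in this card's metadata):

```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run
aggregated = load_dataset("open-llm-leaderboard/details_cookinai__DonutLM-v1",
                          "results",
                          split="latest")
print(aggregated[0])  # aggregated scores for the latest run
```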
## Latest results
These are the [latest results from run 2023-12-23T17:20:03.494171](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__DonutLM-v1/blob/main/results_2023-12-23T17-20-03.494171.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576690631133046,
"acc_stderr": 0.03192024452939422,
"acc_norm": 0.6585907567051082,
"acc_norm_stderr": 0.032571444037302465,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6336336766166446,
"mc2_stderr": 0.015095668911066656
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.6667994423421629,
"acc_stderr": 0.004703942346762255,
"acc_norm": 0.8590918143796057,
"acc_norm_stderr": 0.003472157511639361
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634285,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028072,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028072
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720885,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720885
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6336336766166446,
"mc2_stderr": 0.015095668911066656
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.010869778633168367
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361863
}
}
```
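
To drill down from these aggregates into per-sample predictions for a single task, load the corresponding per-task configuration. The sketch below assumes the `harness_gsm8k_5` configuration listed in this card's metadata; the exact column layout of the detail files is not documented here, so inspect it before relying on specific fields:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot GSM8K evaluation of cookinai/DonutLM-v1
gsm8k_details = load_dataset("open-llm-leaderboard/details_cookinai__DonutLM-v1",
                             "harness_gsm8k_5",
                             split="latest")

print(gsm8k_details.column_names)  # which fields are available per example
print(gsm8k_details[0])            # first evaluated example
```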
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cookinai__DonutLM-v1 | [
"region:us"
] | 2023-12-23T17:22:23+00:00 | {"pretty_name": "Evaluation run of cookinai/DonutLM-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [cookinai/DonutLM-v1](https://huggingface.co/cookinai/DonutLM-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cookinai__DonutLM-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:20:03.494171](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__DonutLM-v1/blob/main/results_2023-12-23T17-20-03.494171.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576690631133046,\n \"acc_stderr\": 0.03192024452939422,\n \"acc_norm\": 0.6585907567051082,\n \"acc_norm_stderr\": 0.032571444037302465,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6336336766166446,\n \"mc2_stderr\": 0.015095668911066656\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6667994423421629,\n \"acc_stderr\": 0.004703942346762255,\n \"acc_norm\": 0.8590918143796057,\n \"acc_norm_stderr\": 0.003472157511639361\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339525,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339525\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n 
\"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028072,\n \"acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028072\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n \"acc_stderr\": 0.013182222616720885,\n \"acc_norm\": 0.8378033205619413,\n 
\"acc_norm_stderr\": 0.013182222616720885\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6336336766166446,\n \"mc2_stderr\": 0.015095668911066656\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168367\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \"acc_stderr\": 0.012972465034361863\n }\n}\n```", "repo_url": "https://huggingface.co/cookinai/DonutLM-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["**/details_harness|winogrande|5_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-20-03.494171.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_20_03.494171", "path": ["results_2023-12-23T17-20-03.494171.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T17-20-03.494171.parquet"]}]}]} | 2023-12-23T17:22:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cookinai/DonutLM-v1
Dataset automatically created during the evaluation run of model cookinai/DonutLM-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T17:20:03.494171 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cookinai/DonutLM-v1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/DonutLM-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:20:03.494171(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cookinai/DonutLM-v1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/DonutLM-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:20:03.494171(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cookinai/DonutLM-v1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/DonutLM-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:20:03.494171(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
bbdd122bd2a8ac29e178447783659fb88d1f57c5 |
# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-GQA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/smol_llama-220M-GQA](https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA",
"harness_winogrande_5",
split="train")
```
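If you only need the aggregated metrics rather than the per-sample details, the "results" configuration can be loaded the same way. This is a minimal sketch, assuming the "results" configuration exposes a "latest" split like the per-task configurations do:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
# (assumes the "results" config has a "latest" split, mirroring the per-task configs)
results = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA",
	"results",
	split="latest")
```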
## Latest results
These are the [latest results from run 2023-12-23T17:30:41.856750](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA/blob/main/results_2023-12-23T17-30-41.856750.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25795142962640294,
"acc_stderr": 0.03080235085085749,
"acc_norm": 0.25898812897491635,
"acc_norm_stderr": 0.031585978490541275,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662588,
"mc2": 0.44553738237393864,
"mc2_stderr": 0.015343428436494402
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.24829351535836178,
"acc_norm_stderr": 0.012624912868089762
},
"harness|hellaswag|10": {
"acc": 0.280920135431189,
"acc_stderr": 0.004485300194072271,
"acc_norm": 0.2976498705437164,
"acc_norm_stderr": 0.004562902604938731
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313139,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313139
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.0433643270799318,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.0433643270799318
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.19310344827586207,
"acc_stderr": 0.032894455221273995,
"acc_norm": 0.19310344827586207,
"acc_norm_stderr": 0.032894455221273995
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3319327731092437,
"acc_stderr": 0.030588697013783663,
"acc_norm": 0.3319327731092437,
"acc_norm_stderr": 0.030588697013783663
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3119266055045872,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.3119266055045872,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17937219730941703,
"acc_stderr": 0.025749819569192783,
"acc_norm": 0.17937219730941703,
"acc_norm_stderr": 0.025749819569192783
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20306513409961685,
"acc_stderr": 0.01438552507661158,
"acc_norm": 0.20306513409961685,
"acc_norm_stderr": 0.01438552507661158
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.0222896388526179,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.0222896388526179
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.023350225475471414,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.023350225475471414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262203,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.01102549929144374,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.01102549929144374
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.016906615927288135,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.016906615927288135
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39183673469387753,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.39183673469387753,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662588,
"mc2": 0.44553738237393864,
"mc2_stderr": 0.015343428436494402
},
"harness|winogrande|5": {
"acc": 0.5098658247829518,
"acc_stderr": 0.014049749833367585
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544723
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA | [
"region:us"
] | 2023-12-23T17:32:32+00:00 | {"pretty_name": "Evaluation run of BEE-spoke-data/smol_llama-220M-GQA", "dataset_summary": "Dataset automatically created during the evaluation run of model [BEE-spoke-data/smol_llama-220M-GQA](https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:30:41.856750](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA/blob/main/results_2023-12-23T17-30-41.856750.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25795142962640294,\n \"acc_stderr\": 0.03080235085085749,\n \"acc_norm\": 0.25898812897491635,\n \"acc_norm_stderr\": 0.031585978490541275,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662588,\n \"mc2\": 0.44553738237393864,\n \"mc2_stderr\": 0.015343428436494402\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n \"acc_norm\": 0.24829351535836178,\n \"acc_norm_stderr\": 0.012624912868089762\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.280920135431189,\n \"acc_stderr\": 0.004485300194072271,\n \"acc_norm\": 0.2976498705437164,\n \"acc_norm_stderr\": 0.004562902604938731\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313139,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313139\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.19310344827586207,\n \"acc_stderr\": 0.032894455221273995,\n \"acc_norm\": 0.19310344827586207,\n \"acc_norm_stderr\": 0.032894455221273995\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n \"acc_norm\": 0.3471502590673575,\n 
\"acc_norm_stderr\": 0.034356961683613546\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3319327731092437,\n \"acc_stderr\": 0.030588697013783663,\n \"acc_norm\": 0.3319327731092437,\n \"acc_norm_stderr\": 0.030588697013783663\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3119266055045872,\n \"acc_stderr\": 0.019862967976707245,\n \"acc_norm\": 0.3119266055045872,\n \"acc_norm_stderr\": 0.019862967976707245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n \"acc_stderr\": 0.025749819569192783,\n \"acc_norm\": 0.17937219730941703,\n \"acc_norm_stderr\": 0.025749819569192783\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 
0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20306513409961685,\n \"acc_stderr\": 0.01438552507661158,\n \"acc_norm\": 0.20306513409961685,\n \"acc_norm_stderr\": 0.01438552507661158\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.0222896388526179,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.0222896388526179\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729487,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729487\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n \"acc_stderr\": 0.023350225475471414,\n \"acc_norm\": 0.21543408360128619,\n \"acc_norm_stderr\": 0.023350225475471414\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.01102549929144374,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.01102549929144374\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288135,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288135\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.03125127591089165,\n \"acc_norm\": 0.39183673469387753,\n \"acc_norm_stderr\": 0.03125127591089165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n \"acc_stderr\": 0.03240004825594688,\n \"acc_norm\": 0.22289156626506024,\n \"acc_norm_stderr\": 0.03240004825594688\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662588,\n \"mc2\": 0.44553738237393864,\n \"mc2_stderr\": 0.015343428436494402\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5098658247829518,\n \"acc_stderr\": 0.014049749833367585\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544723\n }\n}\n```", "repo_url": "https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-30-41.856750.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-30-41.856750.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-30-41.856750.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-30-41.856750.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-30-41.856750.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["**/details_harness|winogrande|5_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-23T17-30-41.856750.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_30_41.856750", "path": ["results_2023-12-23T17-30-41.856750.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-30-41.856750.parquet"]}]}]} | 2023-12-23T17:32:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-GQA
Dataset automatically created during the evaluation run of model BEE-spoke-data/smol_llama-220M-GQA on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
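A minimal loading sketch is shown below; the repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming convention, since the exact id is not restated in this summary:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-GQA",
	"harness_winogrande_5",
	split="train")
```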
## Latest results
These are the latest results from run 2023-12-23T17:30:41.856750 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-GQA\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/smol_llama-220M-GQA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:30:41.856750(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-GQA\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/smol_llama-220M-GQA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:30:41.856750(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-GQA\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/smol_llama-220M-GQA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:30:41.856750(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
b68a7585886846248c093277e6e9c2e83c79c247 | # Dataset Card for "fatti-e-misfatti"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mii-llm/fatti-e-misfatti | [
"region:us"
] | 2023-12-23T17:38:18+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 37476242, "num_examples": 25728}], "download_size": 22650262, "dataset_size": 37476242}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-23T17:38:31+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fatti-e-misfatti"
More Information needed | [
"# Dataset Card for \"fatti-e-misfatti\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fatti-e-misfatti\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fatti-e-misfatti\"\n\nMore Information needed"
] |
6c934d775c319a4a47a40edda56a19782fd52a1e |
# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mediocredev/open-llama-3b-v2-chat](https://huggingface.co/mediocredev/open-llama-3b-v2-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat",
"harness_winogrande_5",
split="train")
```
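The aggregated metrics can be pulled the same way; a small sketch, assuming the "results" configuration and the "latest" split listed for this dataset:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the
# most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat",
	"results",
	split="latest")
```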
## Latest results
These are the [latest results from run 2023-12-23T17:37:59.666362](https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat/blob/main/results_2023-12-23T17-37-59.666362.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.29439428040860166,
"acc_stderr": 0.03226802892374593,
"acc_norm": 0.2963863014813873,
"acc_norm_stderr": 0.03305754972277571,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.3783540937373688,
"mc2_stderr": 0.014134579864323335
},
"harness|arc:challenge|25": {
"acc": 0.37627986348122866,
"acc_stderr": 0.014157022555407173,
"acc_norm": 0.4061433447098976,
"acc_norm_stderr": 0.01435165669009786
},
"harness|hellaswag|10": {
"acc": 0.5233021310495917,
"acc_stderr": 0.00498435966995192,
"acc_norm": 0.7030472017526389,
"acc_norm_stderr": 0.004559817589182068
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30943396226415093,
"acc_stderr": 0.028450154794118627,
"acc_norm": 0.30943396226415093,
"acc_norm_stderr": 0.028450154794118627
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386705,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386705
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048487,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.02489246917246285,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.02489246917246285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.03304205087813653,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.03304205087813653
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3,
"acc_stderr": 0.023234581088428484,
"acc_norm": 0.3,
"acc_norm_stderr": 0.023234581088428484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960954,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960954
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25321100917431194,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.25321100917431194,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990403,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990403
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267406,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3090676883780332,
"acc_stderr": 0.016524988919702187,
"acc_norm": 0.3090676883780332,
"acc_norm_stderr": 0.016524988919702187
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.01424263007057489,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.01424263007057489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034961,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.01784808957491323,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.01784808957491323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071857,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071857
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.3783540937373688,
"mc2_stderr": 0.014134579864323335
},
"harness|winogrande|5": {
"acc": 0.6550907655880032,
"acc_stderr": 0.013359379805033692
},
"harness|gsm8k|5": {
"acc": 0.02577710386656558,
"acc_stderr": 0.004365042953621805
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat | [
"region:us"
] | 2023-12-23T17:39:42+00:00 | {"pretty_name": "Evaluation run of mediocredev/open-llama-3b-v2-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [mediocredev/open-llama-3b-v2-chat](https://huggingface.co/mediocredev/open-llama-3b-v2-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:37:59.666362](https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat/blob/main/results_2023-12-23T17-37-59.666362.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29439428040860166,\n \"acc_stderr\": 0.03226802892374593,\n \"acc_norm\": 0.2963863014813873,\n \"acc_norm_stderr\": 0.03305754972277571,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.3783540937373688,\n \"mc2_stderr\": 0.014134579864323335\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.37627986348122866,\n \"acc_stderr\": 0.014157022555407173,\n \"acc_norm\": 0.4061433447098976,\n \"acc_norm_stderr\": 0.01435165669009786\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5233021310495917,\n \"acc_stderr\": 0.00498435966995192,\n \"acc_norm\": 0.7030472017526389,\n \"acc_norm_stderr\": 0.004559817589182068\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30943396226415093,\n \"acc_stderr\": 0.028450154794118627,\n \"acc_norm\": 0.30943396226415093,\n \"acc_norm_stderr\": 0.028450154794118627\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386705,\n \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386705\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.291005291005291,\n \"acc_stderr\": 0.02339382650048487,\n \"acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.02489246917246285,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.02489246917246285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.03304205087813653,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.03304205087813653\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.023234581088428484,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.023234581088428484\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960954,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960954\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277723,\n \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277723\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25321100917431194,\n \"acc_stderr\": 0.01864407304137504,\n \"acc_norm\": 0.25321100917431194,\n \"acc_norm_stderr\": 0.01864407304137504\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.028379449451588674,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.028379449451588674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990403,\n \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990403\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267406,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267406\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.3090676883780332,\n \"acc_stderr\": 0.016524988919702187,\n \"acc_norm\": 0.3090676883780332,\n \"acc_norm_stderr\": 0.016524988919702187\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.01424263007057489,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.01424263007057489\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2973856209150327,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.026041766202717163,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.026041766202717163\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n \"acc_stderr\": 0.010926496102034961,\n \"acc_norm\": 0.24119947848761408,\n \"acc_norm_stderr\": 0.010926496102034961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.01784808957491323,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.01784808957491323\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.3034825870646766,\n \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071857,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071857\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.3783540937373688,\n \"mc2_stderr\": 0.014134579864323335\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6550907655880032,\n \"acc_stderr\": 0.013359379805033692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 
0.004365042953621805\n }\n}\n```", "repo_url": "https://huggingface.co/mediocredev/open-llama-3b-v2-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-59.666362.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-59.666362.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-59.666362.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-59.666362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-59.666362.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_37_59.666362", "path": ["**/details_harness|winogrande|5_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-37-59.666362.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T17_37_59.666362", "path": ["results_2023-12-23T17-37-59.666362.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T17-37-59.666362.parquet"]}]}]} | 2023-12-23T17:40:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-chat
Dataset automatically created during the evaluation run of model mediocredev/open-llama-3b-v2-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
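The snippet below is reproduced from this dataset's own metadata (the repository and configuration names are taken from there), so it reflects the intended usage rather than a new API:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (5-shot Winogrande);
# the "train" split always points at the latest run.
data = load_dataset("open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat",
	"harness_winogrande_5",
	split="train")
```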
## Latest results
These are the latest results from run 2023-12-23T17:37:59.666362 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-chat\n\n\n\nDataset automatically created during the evaluation run of model mediocredev/open-llama-3b-v2-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:59.666362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-chat\n\n\n\nDataset automatically created during the evaluation run of model mediocredev/open-llama-3b-v2-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:59.666362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-chat\n\n\n\nDataset automatically created during the evaluation run of model mediocredev/open-llama-3b-v2-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:59.666362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ff7d064556ea997c61e249ca2fd253fb5af91c70 |
# Dataset Card for Evaluation run of cloudyu/mixtral_7bx4_moe
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/mixtral_7bx4_moe](https://huggingface.co/cloudyu/mixtral_7bx4_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe",
"harness_winogrande_5",
split="train")
```
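The aggregated metrics described above live in the "results" configuration. Assuming it follows the same split convention as the per-task configs (a "latest" split pointing at the newest run — the split name here is an assumption carried over from the other configurations, not something stated for this card), it can be loaded the same way; a minimal sketch:

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; the config name "results" is stated
# above, while the "latest" split name is assumed from the per-task configs.
results = load_dataset("open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe",
	"results",
	split="latest")
```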
## Latest results
These are the [latest results from run 2023-12-23T17:37:28.145090](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe/blob/main/results_2023-12-23T17-37-28.145090.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6311139010801706,
"acc_stderr": 0.03229082356266579,
"acc_norm": 0.632622270079106,
"acc_norm_stderr": 0.0329353580988297,
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5985125569293038,
"mc2_stderr": 0.015744189058578734
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909865,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620451
},
"harness|hellaswag|10": {
"acc": 0.6685919139613623,
"acc_stderr": 0.00469757396216943,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.0035356302890914566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931666,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931666
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010323,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867443,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867443
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.01362555690799345,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.01362555690799345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032205,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5985125569293038,
"mc2_stderr": 0.015744189058578734
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.0117056975652052
},
"harness|gsm8k|5": {
"acc": 0.6209249431387415,
"acc_stderr": 0.013363630295088361
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe | [
"region:us"
] | 2023-12-23T17:39:45+00:00 | {"pretty_name": "Evaluation run of cloudyu/mixtral_7bx4_moe", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/mixtral_7bx4_moe](https://huggingface.co/cloudyu/mixtral_7bx4_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:37:28.145090](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe/blob/main/results_2023-12-23T17-37-28.145090.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6311139010801706,\n \"acc_stderr\": 0.03229082356266579,\n \"acc_norm\": 0.632622270079106,\n \"acc_norm_stderr\": 0.0329353580988297,\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5985125569293038,\n \"mc2_stderr\": 0.015744189058578734\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909865,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620451\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6685919139613623,\n \"acc_stderr\": 0.00469757396216943,\n \"acc_norm\": 0.8528181637124079,\n \"acc_norm_stderr\": 0.0035356302890914566\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n 
\"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931666,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931666\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010323,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867443,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867443\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.01362555690799345,\n 
\"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.01362555690799345\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032205,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032205\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5985125569293038,\n \"mc2_stderr\": 0.015744189058578734\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.0117056975652052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \"acc_stderr\": 0.013363630295088361\n }\n}\n```", "repo_url": 
"https://huggingface.co/cloudyu/mixtral_7bx4_moe", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-28.145090.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-28.145090.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-28.145090.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-28.145090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-28.145090.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-37-28.145090.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["**/details_harness|winogrande|5_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-37-28.145090.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_37_28.145090", "path": ["results_2023-12-23T17-37-28.145090.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T17-37-28.145090.parquet"]}]}]} | 2023-12-23T17:40:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/mixtral_7bx4_moe
Dataset automatically created during the evaluation run of model cloudyu/mixtral_7bx4_moe on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
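A minimal sketch of what that call could look like (the repository id below is an assumption, following the leaderboard's usual `details_<org>__<model>` naming pattern, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Hypothetical repository id, assuming the standard Open LLM Leaderboard
# naming pattern details_<org>__<model> for this evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__mixtral_7bx4_moe",
    "harness_winogrande_5",  # one task-specific configuration from the file listing
    split="train",           # "train" always points to the latest results
)
```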
## Latest results
These are the latest results from run 2023-12-23T17:37:28.145090 (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/mixtral_7bx4_moe\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/mixtral_7bx4_moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:28.145090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/mixtral_7bx4_moe\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/mixtral_7bx4_moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:28.145090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cloudyu/mixtral_7bx4_moe\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/mixtral_7bx4_moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:37:28.145090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7e39517560abc7bff013032e4559c659495260b7 |
# Dataset Card for Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sumo43/SOLAR-10.7B-Instruct-DPO-v1.0](https://huggingface.co/sumo43/SOLAR-10.7B-Instruct-DPO-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0",
"harness_winogrande_5",
split="train")
```
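The aggregated metrics can be pulled the same way through the "results" configuration. A short sketch, assuming this run follows the same layout as the other runs in this dump (a "results" configuration with a "latest" split):

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0",
    "results",        # aggregated results, as opposed to the per-task configs
    split="latest",   # always points to the most recent run
)
print(results[0])
```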
## Latest results
These are the [latest results from run 2023-12-23T17:50:48.785620](https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0/blob/main/results_2023-12-23T17-50-48.785620.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6430537810273164,
"acc_stderr": 0.032247593325984356,
"acc_norm": 0.6479435232898493,
"acc_norm_stderr": 0.03290230073791138,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660877,
"mc2": 0.7327172525156499,
"mc2_stderr": 0.014785133893758778
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847626,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.01295506596371069
},
"harness|hellaswag|10": {
"acc": 0.743079067914758,
"acc_stderr": 0.0043604245361451195,
"acc_norm": 0.8977295359490142,
"acc_norm_stderr": 0.0030238440318883834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.02519092111460391,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.02519092111460391
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590175,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707779,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.01438552507661157,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.01438552507661157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297236,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297236
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364803,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819753,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038923,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468723,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468723
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660877,
"mc2": 0.7327172525156499,
"mc2_stderr": 0.014785133893758778
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.01081491100961398
},
"harness|gsm8k|5": {
"acc": 0.36542835481425323,
"acc_stderr": 0.013264282030266637
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0 | [
"region:us"
] | 2023-12-23T17:53:04+00:00 | {"pretty_name": "Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [sumo43/SOLAR-10.7B-Instruct-DPO-v1.0](https://huggingface.co/sumo43/SOLAR-10.7B-Instruct-DPO-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T17:50:48.785620](https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0/blob/main/results_2023-12-23T17-50-48.785620.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6430537810273164,\n \"acc_stderr\": 0.032247593325984356,\n \"acc_norm\": 0.6479435232898493,\n \"acc_norm_stderr\": 0.03290230073791138,\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.017225627083660877,\n \"mc2\": 0.7327172525156499,\n \"mc2_stderr\": 0.014785133893758778\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847626,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.01295506596371069\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.743079067914758,\n \"acc_stderr\": 0.0043604245361451195,\n \"acc_norm\": 0.8977295359490142,\n \"acc_norm_stderr\": 0.0030238440318883834\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.02519092111460391,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.02519092111460391\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590175,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590175\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707779,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707779\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n 
\"acc_stderr\": 0.01438552507661157,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.01438552507661157\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297236,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297236\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.02698147804364803,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.02698147804364803\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819753,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819753\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038923,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468723,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468723\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982055,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982055\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.017225627083660877,\n \"mc2\": 0.7327172525156499,\n \"mc2_stderr\": 0.014785133893758778\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.01081491100961398\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36542835481425323,\n \"acc_stderr\": 0.013264282030266637\n }\n}\n```", "repo_url": 
"https://huggingface.co/sumo43/SOLAR-10.7B-Instruct-DPO-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-50-48.785620.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-50-48.785620.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-50-48.785620.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T17-50-48.785620.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-50-48.785620.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-50-48.785620.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["**/details_harness|winogrande|5_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T17-50-48.785620.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T17_50_48.785620", "path": ["results_2023-12-23T17-50-48.785620.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T17-50-48.785620.parquet"]}]}]} | 2023-12-23T17:53:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0
Dataset automatically created during the evaluation run of model sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
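A minimal sketch is shown below; the repository name is inferred from the leaderboard's usual `details_<org>__<model>` naming pattern and is an assumption, as are the chosen config and split:

```python
from datasets import load_dataset

# Repository name assumed from the details_<org>__<model> pattern used by the leaderboard
data = load_dataset("open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0",
	"harness_winogrande_5",
	split="train")
```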
## Latest results
These are the latest results from run 2023-12-23T17:50:48.785620 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:50:48.785620(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T17:50:48.785620(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of sumo43/SOLAR-10.7B-Instruct-DPO-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T17:50:48.785620(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
1cf075610413b1bbff39bb035a8e015050b3dc60 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx4_MOE_24B](https://huggingface.co/cloudyu/Mixtral_7Bx4_MOE_24B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B",
"harness_winogrande_5",
split="train")
```
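Once loaded, `data` behaves like any other `datasets.Dataset`, so the standard inspection helpers apply (a quick sketch; the exact column names depend on the task):

```python
# Inspect the per-example details loaded above
print(len(data))       # number of evaluated examples
print(data.features)   # column names and types for this task
print(data[0])         # first evaluated example with its per-example records
```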
## Latest results
These are the [latest results from run 2023-12-23T18:05:51.243288](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B/blob/main/results_2023-12-23T18-05-51.243288.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6322199879229019,
"acc_stderr": 0.03229738563088343,
"acc_norm": 0.6337436892396372,
"acc_norm_stderr": 0.03294310301937023,
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5978275429044729,
"mc2_stderr": 0.015733742788933292
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257187,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063232
},
"harness|hellaswag|10": {
"acc": 0.6683927504481179,
"acc_stderr": 0.004698285350019217,
"acc_norm": 0.852320254929297,
"acc_norm_stderr": 0.0035405716545956313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.02542483508692401,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.02542483508692401
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337714,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.02849346509102859,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.02849346509102859
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990932,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990932
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190093,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190093
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277743,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277743
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5978275429044729,
"mc2_stderr": 0.015733742788933292
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.6171341925701289,
"acc_stderr": 0.013389223491820474
}
}
```
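The aggregated numbers above can also be pulled programmatically through the "results" configuration (a sketch, assuming the "results" config and its "latest" split follow the same layout as the per-task configs listed in this card's metadata):

```python
from datasets import load_dataset

# Aggregated run-level results; "latest" points at the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B",
	"results",
	split="latest")
print(results[0])
```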
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B | [
"region:us"
] | 2023-12-23T18:08:09+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx4_MOE_24B](https://huggingface.co/cloudyu/Mixtral_7Bx4_MOE_24B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:05:51.243288](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B/blob/main/results_2023-12-23T18-05-51.243288.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6322199879229019,\n \"acc_stderr\": 0.03229738563088343,\n \"acc_norm\": 0.6337436892396372,\n \"acc_norm_stderr\": 0.03294310301937023,\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5978275429044729,\n \"mc2_stderr\": 0.015733742788933292\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257187,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n \"acc_stderr\": 0.004698285350019217,\n \"acc_norm\": 0.852320254929297,\n \"acc_norm_stderr\": 0.0035405716545956313\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.02542483508692401,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.02542483508692401\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.024137632429337714,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.024137632429337714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.02849346509102859,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.02849346509102859\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990932,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990932\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190093,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190093\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277743,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277743\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5978275429044729,\n \"mc2_stderr\": 0.015733742788933292\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6171341925701289,\n \"acc_stderr\": 0.013389223491820474\n 
}\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Mixtral_7Bx4_MOE_24B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-05-51.243288.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-05-51.243288.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-05-51.243288.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-05-51.243288.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-05-51.243288.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_05_51.243288", "path": ["**/details_harness|winogrande|5_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-05-51.243288.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T18_05_51.243288", "path": ["results_2023-12-23T18-05-51.243288.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T18-05-51.243288.parquet"]}]}]} | 2023-12-23T18:08:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B
Dataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx4_MOE_24B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
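A minimal sketch of that call is shown below; the repository id follows the `details_<org>__<model>` naming convention used by these leaderboard detail datasets and is an assumption here:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> naming convention
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx4_MOE_24B",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",
)
```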
## Latest results
These are the latest results from run 2023-12-23T18:05:51.243288 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx4_MOE_24B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:05:51.243288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx4_MOE_24B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:05:51.243288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx4_MOE_24B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx4_MOE_24B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:05:51.243288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
d647bb8f10436df68a9aca65ffafec3aa2b95863 |
[ZINC20](https://zinc20.docking.org/) Dataset with [SELFIES](https://arxiv.org/abs/1905.13741) added. Any SMILES string that could not be successfully converted was dropped from the dataset.
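A minimal sketch of how such filtering could be done with the `selfies` package (an assumption for illustration, not the authors' actual conversion script):

```python
import selfies as sf

def smiles_to_selfies(smiles: str):
    """Return the SELFIES encoding of a SMILES string, or None if it cannot be encoded."""
    try:
        return sf.encoder(smiles)
    except sf.EncoderError:
        # Rows whose SMILES cannot be encoded would be dropped from the dataset
        return None
```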
Every tranche was downloaded; this is not the ~1B example ML subset from https://files.docking.org/zinc20-ML/.
The dataset was entirely shuffled and then split into 80%/10%/10% train/val/test splits.
A file vocab.csv in the root of the repository contains all of the SELFIES tokens found in the data, with [START], [STOP], and [PAD] added. | haydn-jones/ZINC20 | [
"size_categories:1B<n<10B",
"license:mit",
"chemistry",
"biology",
"medical",
"arxiv:1905.13741",
"region:us"
] | 2023-12-23T18:08:17+00:00 | {"license": "mit", "size_categories": ["1B<n<10B"], "dataset_info": {"features": [{"name": "smiles", "dtype": "large_string"}, {"name": "zinc_id", "dtype": "int64"}, {"name": "SELFIES", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 393170565049, "num_examples": 1538340669}, {"name": "val", "num_bytes": 47753116448, "num_examples": 192292584}, {"name": "test", "num_bytes": 46114402425, "num_examples": 192292584}], "download_size": 174349539018, "dataset_size": 487038083922}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["chemistry", "biology", "medical"]} | 2023-12-24T01:06:14+00:00 | [
"1905.13741"
] | [] | TAGS
#size_categories-1B<n<10B #license-mit #chemistry #biology #medical #arxiv-1905.13741 #region-us
|
ZINC20 Dataset with SELFIES added. Any SMILES string that could not be successfully converted was dropped from the dataset.
Every tranche was downloaded; this is not the ~1B example ML subset from URL
The dataset was entirely shuffled and then split into 80%/10%/10% train/val/test splits.
A file URL in the root of the repository contains all of the SELFIES tokens found in the data, with [START], [STOP], and [PAD] added. | [] | [
"TAGS\n#size_categories-1B<n<10B #license-mit #chemistry #biology #medical #arxiv-1905.13741 #region-us \n"
] | [
42
] | [
"passage: TAGS\n#size_categories-1B<n<10B #license-mit #chemistry #biology #medical #arxiv-1905.13741 #region-us \n"
] |
215118397fb5bafdd95b86d569fa0cc15cf9213b |
# Dataset Card for Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/Mixtral-GQA-400m-v2](https://huggingface.co/BEE-spoke-data/Mixtral-GQA-400m-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2",
"harness_winogrande_5",
split="train")
```
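To enumerate the available configurations instead of hard-coding one, the `datasets` library can list them; this is a small illustrative snippet rather than part of the original evaluation tooling:

```python
from datasets import get_dataset_config_names

# Lists the 63 per-task configurations plus the aggregated "results" configuration
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2"
)
print(configs)
```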
## Latest results
These are the [latest results from run 2023-12-23T18:12:16.481327](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2/blob/main/results_2023-12-23T18-12-16.481327.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25928519994733173,
"acc_stderr": 0.03084201892972556,
"acc_norm": 0.26026390145872735,
"acc_norm_stderr": 0.03164977363535095,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.465481151431113,
"mc2_stderr": 0.015410819143540583
},
"harness|arc:challenge|25": {
"acc": 0.17064846416382254,
"acc_stderr": 0.010993654168413738,
"acc_norm": 0.2022184300341297,
"acc_norm_stderr": 0.011737454431872104
},
"harness|hellaswag|10": {
"acc": 0.2698665604461263,
"acc_stderr": 0.004429831152914678,
"acc_norm": 0.2778331009759012,
"acc_norm_stderr": 0.004470152081675125
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310053,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310053
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628806,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628806
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403325,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403325
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756777,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756777
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3,
"acc_stderr": 0.023234581088428494,
"acc_norm": 0.3,
"acc_norm_stderr": 0.023234581088428494
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2771392081736909,
"acc_stderr": 0.016005636294122418,
"acc_norm": 0.2771392081736909,
"acc_norm_stderr": 0.016005636294122418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113902,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113902
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.016992723465466222,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.016992723465466222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017183,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017183
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.465481151431113,
"mc2_stderr": 0.015410819143540583
},
"harness|winogrande|5": {
"acc": 0.4996053670086819,
"acc_stderr": 0.014052481306049516
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225397
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2 | [
"region:us"
] | 2023-12-23T18:14:35+00:00 | {"pretty_name": "Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BEE-spoke-data/Mixtral-GQA-400m-v2](https://huggingface.co/BEE-spoke-data/Mixtral-GQA-400m-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:12:16.481327](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2/blob/main/results_2023-12-23T18-12-16.481327.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25928519994733173,\n \"acc_stderr\": 0.03084201892972556,\n \"acc_norm\": 0.26026390145872735,\n \"acc_norm_stderr\": 0.03164977363535095,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.465481151431113,\n \"mc2_stderr\": 0.015410819143540583\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.17064846416382254,\n \"acc_stderr\": 0.010993654168413738,\n \"acc_norm\": 0.2022184300341297,\n \"acc_norm_stderr\": 0.011737454431872104\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2698665604461263,\n \"acc_stderr\": 0.004429831152914678,\n \"acc_norm\": 0.2778331009759012,\n \"acc_norm_stderr\": 0.004470152081675125\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310053,\n \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310053\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628806,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628806\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036847,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403325,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403325\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756777,\n \"acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756777\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735703,\n \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735703\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.023234581088428494,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.023234581088428494\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2771392081736909,\n \"acc_stderr\": 0.016005636294122418,\n \"acc_norm\": 0.2771392081736909,\n \"acc_norm_stderr\": 0.016005636294122418\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123563,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123563\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n \"acc_stderr\": 0.010976425013113902,\n \"acc_norm\": 0.24445893089960888,\n \"acc_norm_stderr\": 0.010976425013113902\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.016992723465466222,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.016992723465466222\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017183,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017183\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.465481151431113,\n \"mc2_stderr\": 0.015410819143540583\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4996053670086819,\n \"acc_stderr\": 0.014052481306049516\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n 
\"acc_stderr\": 0.0007581501137225397\n }\n}\n```", "repo_url": "https://huggingface.co/BEE-spoke-data/Mixtral-GQA-400m-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-12-16.481327.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-12-16.481327.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-12-16.481327.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-12-16.481327.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-12-16.481327.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_12_16.481327", "path": ["**/details_harness|winogrande|5_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-12-16.481327.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T18_12_16.481327", "path": ["results_2023-12-23T18-12-16.481327.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T18-12-16.481327.parquet"]}]}]} | 2023-12-23T18:14:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2
Dataset automatically created during the evaluation run of model BEE-spoke-data/Mixtral-GQA-400m-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
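A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model and the per-task configuration names listed in this card's metadata:

```python
from datasets import load_dataset

# Load one task configuration; the repository name below follows the
# leaderboard's naming convention and is an assumption for this model.
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__Mixtral-GQA-400m-v2",
	"harness_winogrande_5",
	split="train")
```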
## Latest results
These are the latest results from run 2023-12-23T18:12:16.481327 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/Mixtral-GQA-400m-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:12:16.481327(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/Mixtral-GQA-400m-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:12:16.481327(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BEE-spoke-data/Mixtral-GQA-400m-v2\n\n\n\nDataset automatically created during the evaluation run of model BEE-spoke-data/Mixtral-GQA-400m-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:12:16.481327(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
fa10eaf96c19311517a55fa7a8a84e064b199ae0 | # Dataset Card for "quesiti-universitari"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mii-llm/quesiti-universitari | [
"region:us"
] | 2023-12-23T18:16:32+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5021246, "num_examples": 2700}], "download_size": 2770346, "dataset_size": 5021246}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-23T18:16:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "quesiti-universitari"
More Information needed | [
"# Dataset Card for \"quesiti-universitari\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"quesiti-universitari\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"quesiti-universitari\"\n\nMore Information needed"
] |
312c939b0cf0afd4fe04179aa3802f42bb99dc49 |
# Dataset Card for Evaluation run of Mihaiii/Metis-0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.4](https://huggingface.co/Mihaiii/Metis-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.4",
"harness_winogrande_5",
split="train")
```
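
As a sketch, assuming the configuration and split names described above, the aggregated metrics can also be read from the `results` configuration, whose `latest` split points at the most recent run:

```python
from datasets import load_dataset

# Load the aggregated metrics; the "results" configuration and "latest"
# split follow the convention used by these evaluation-detail repositories.
results = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.4",
	"results",
	split="latest")
```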
## Latest results
These are the [latest results from run 2023-12-23T18:14:22.167641](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.4/blob/main/results_2023-12-23T18-14-22.167641.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6225958370835007,
"acc_stderr": 0.032727358082431525,
"acc_norm": 0.6305379226640814,
"acc_norm_stderr": 0.03342131270283292,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5920120177050053,
"mc2_stderr": 0.01556995067121447
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192601
},
"harness|hellaswag|10": {
"acc": 0.6550487950607449,
"acc_stderr": 0.004743808792037864,
"acc_norm": 0.8390758812985462,
"acc_norm_stderr": 0.003667099594023357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267025,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787582,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.03404705328653881,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.03404705328653881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417465,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5920120177050053,
"mc2_stderr": 0.01556995067121447
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698352
},
"harness|gsm8k|5": {
"acc": 0.2221379833206975,
"acc_stderr": 0.011449986902435323
}
}
```
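The aggregated numbers above can also be read back programmatically; a small sketch, assuming the `results` configuration and `latest` split listed in the repository metadata:

```python
from datasets import load_dataset

# The "results" configuration holds one row per evaluation run with the
# aggregated metrics shown above; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Mihaiii__Metis-0.4",
    "results",
    split="latest",
)
print(results[0])
```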
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Metis-0.4 | [
"region:us"
] | 2023-12-23T18:16:38+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Metis-0.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.4](https://huggingface.co/Mihaiii/Metis-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Metis-0.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:14:22.167641](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.4/blob/main/results_2023-12-23T18-14-22.167641.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6225958370835007,\n \"acc_stderr\": 0.032727358082431525,\n \"acc_norm\": 0.6305379226640814,\n \"acc_norm_stderr\": 0.03342131270283292,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5920120177050053,\n \"mc2_stderr\": 0.01556995067121447\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192601\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n \"acc_stderr\": 0.004743808792037864,\n \"acc_norm\": 0.8390758812985462,\n \"acc_norm_stderr\": 0.003667099594023357\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267025,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 
0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787582,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.03404705328653881,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.03404705328653881\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n \"acc_norm_stderr\": 0.01445250045678583\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417465,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5920120177050053,\n \"mc2_stderr\": 0.01556995067121447\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698352\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2221379833206975,\n \"acc_stderr\": 0.011449986902435323\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Metis-0.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["**/details_harness|winogrande|5_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-14-22.167641.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T18_14_22.167641", "path": ["results_2023-12-23T18-14-22.167641.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T18-14-22.167641.parquet"]}]}]} | 2023-12-23T18:16:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Metis-0.4
Dataset automatically created during the evaluation run of model Mihaiii/Metis-0.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
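A minimal sketch, assuming the `harness_winogrande_5` configuration name listed in the repository metadata (any of the other per-task configurations can be substituted):

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task; "train" points to the
# latest evaluation run for this configuration.
data = load_dataset(
    "open-llm-leaderboard/details_Mihaiii__Metis-0.4",
    "harness_winogrande_5",
    split="train",
)
```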
## Latest results
These are the latest results from run 2023-12-23T18:14:22.167641 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:14:22.167641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:14:22.167641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Metis-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:14:22.167641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
21018ec78410e57a81b957d97bd30b1abcd9e620 |
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-34B-v3](https://huggingface.co/APMIC/caigun-lora-model-34B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3",
"harness_winogrande_5",
split="train")
```
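The same pattern works for the other configurations. As a minimal sketch (assuming the standard Hugging Face Datasets API; the exact column layout depends on the run), you can list every available configuration and load the aggregated "results" configuration, whose "latest" split points to the most recent run:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations available")

# The "latest" split of the "results" configuration points to the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results)
```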
## Latest results
These are the [latest results from run 2023-12-23T18:31:31.662412](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3/blob/main/results_2023-12-23T18-31-31.662412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7483676764475267,
"acc_stderr": 0.028542757366816985,
"acc_norm": 0.754246296187874,
"acc_norm_stderr": 0.029076189504433473,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5646830588696623,
"mc2_stderr": 0.014942367992340418
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817829
},
"harness|hellaswag|10": {
"acc": 0.6499701254730134,
"acc_stderr": 0.004760041843651487,
"acc_norm": 0.8477394941246763,
"acc_norm_stderr": 0.003585389636472376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.028919802956134902,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.028919802956134902
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7489361702127659,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.7489361702127659,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6375661375661376,
"acc_stderr": 0.02475747390275205,
"acc_norm": 0.6375661375661376,
"acc_norm_stderr": 0.02475747390275205
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.018003603325863593,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.018003603325863593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.034223985656575494,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.034223985656575494
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527034,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527034
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7974358974358975,
"acc_stderr": 0.0203776609703714,
"acc_norm": 0.7974358974358975,
"acc_norm_stderr": 0.0203776609703714
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.02215937307274444,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.02215937307274444
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862086,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862086
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316952,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316952
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881347,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881347
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6011173184357542,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.6011173184357542,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.021339479988816027,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.021339479988816027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.021823422857744943,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.021823422857744943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654454,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.01924252622654454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5990873533246415,
"acc_stderr": 0.012516960350640814,
"acc_norm": 0.5990873533246415,
"acc_norm_stderr": 0.012516960350640814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.024060599423487424,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.024060599423487424
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8316993464052288,
"acc_stderr": 0.015135803338693367,
"acc_norm": 0.8316993464052288,
"acc_norm_stderr": 0.015135803338693367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650156,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5646830588696623,
"mc2_stderr": 0.014942367992340418
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.0104108497752228
},
"harness|gsm8k|5": {
"acc": 0.5451099317664898,
"acc_stderr": 0.013716318771794602
}
}
```
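As a rough illustration of how this results dictionary can be post-processed (a sketch only — `results.json` is a hypothetical local copy of the JSON above, not a file shipped with the dataset), the snippet below averages the normalized accuracy over the MMLU (`hendrycksTest-*`) subtasks:

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Mean normalized accuracy over all MMLU (hendrycksTest) subtasks.
mmlu = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```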
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3 | [
"region:us"
] | 2023-12-23T18:33:40+00:00 | {"pretty_name": "Evaluation run of APMIC/caigun-lora-model-34B-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-34B-v3](https://huggingface.co/APMIC/caigun-lora-model-34B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:31:31.662412](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v3/blob/main/results_2023-12-23T18-31-31.662412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7483676764475267,\n \"acc_stderr\": 0.028542757366816985,\n \"acc_norm\": 0.754246296187874,\n \"acc_norm_stderr\": 0.029076189504433473,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5646830588696623,\n \"mc2_stderr\": 0.014942367992340418\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817829\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6499701254730134,\n \"acc_stderr\": 0.004760041843651487,\n \"acc_norm\": 0.8477394941246763,\n \"acc_norm_stderr\": 0.003585389636472376\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.028919802956134902,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.028919802956134902\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7489361702127659,\n \"acc_stderr\": 0.028346963777162452,\n \"acc_norm\": 0.7489361702127659,\n \"acc_norm_stderr\": 0.028346963777162452\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924812,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924812\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6375661375661376,\n \"acc_stderr\": 0.02475747390275205,\n \"acc_norm\": 0.6375661375661376,\n \"acc_norm_stderr\": 0.02475747390275205\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.018003603325863593,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.018003603325863593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.034223985656575494,\n \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.034223985656575494\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527034,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527034\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7974358974358975,\n \"acc_stderr\": 0.0203776609703714,\n \"acc_norm\": 0.7974358974358975,\n \"acc_norm_stderr\": 0.0203776609703714\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.02215937307274444,\n \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.02215937307274444\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862086,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862086\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316952,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316952\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881347,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881347\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9080459770114943,\n \"acc_stderr\": 0.010333225570778518,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 0.010333225570778518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6011173184357542,\n \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.6011173184357542,\n \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.021339479988816027,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.021339479988816027\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.021823422857744943,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.021823422857744943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654454,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6276595744680851,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5990873533246415,\n \"acc_stderr\": 0.012516960350640814,\n \"acc_norm\": 0.5990873533246415,\n \"acc_norm_stderr\": 0.012516960350640814\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.024060599423487424,\n \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.024060599423487424\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8316993464052288,\n \"acc_stderr\": 0.015135803338693367,\n \"acc_norm\": 0.8316993464052288,\n \"acc_norm_stderr\": 0.015135803338693367\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650156,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650156\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5646830588696623,\n \"mc2_stderr\": 0.014942367992340418\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.0104108497752228\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5451099317664898,\n \"acc_stderr\": 
0.013716318771794602\n }\n}\n```", "repo_url": "https://huggingface.co/APMIC/caigun-lora-model-34B-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-31-31.662412.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-31-31.662412.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-31-31.662412.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-31-31.662412.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-31-31.662412.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_31_31.662412", "path": ["**/details_harness|winogrande|5_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-31-31.662412.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T18_31_31.662412", "path": ["results_2023-12-23T18-31-31.662412.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T18-31-31.662412.parquet"]}]}]} | 2023-12-23T18:34:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v3
Dataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T18:31:31.662412 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v3\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:31:31.662412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v3\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:31:31.662412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v3\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:31:31.662412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
f10da38136a05618c3e362d068fcc64e11c8a684 |
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-34B-v2](https://huggingface.co/APMIC/caigun-lora-model-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2",
"harness_winogrande_5",
split="train")
```
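
As a minimal sketch, you can also pull only the aggregated scores instead of a per-task table. The `results` configuration name and the `latest` split are taken from this card's configuration list; the exact column layout of that aggregated table is an assumption, so the sketch just prints it.

```python
from datasets import load_dataset

# Aggregated metrics for this run live in the "results" configuration;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2",
    "results",
    split="latest",
)

# Inspect the available columns before relying on any particular field,
# since the schema of the aggregated table is not documented on this card.
print(results.column_names)
```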
## Latest results
These are the [latest results from run 2023-12-23T18:32:48.533990](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2/blob/main/results_2023-12-23T18-32-48.533990.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7514714868021417,
"acc_stderr": 0.028416591943209543,
"acc_norm": 0.7567219646228759,
"acc_norm_stderr": 0.028946697394984712,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5802743849781816,
"mc2_stderr": 0.014915488598308874
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414044,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158294
},
"harness|hellaswag|10": {
"acc": 0.6528579964150567,
"acc_stderr": 0.004750884401095161,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.0035356302890914605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800253,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800253
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474924,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100813,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100813
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.027851252973889778,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.027851252973889778
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6216931216931217,
"acc_stderr": 0.02497695405315524,
"acc_norm": 0.6216931216931217,
"acc_norm_stderr": 0.02497695405315524
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657244,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657244
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969567,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656194,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656194
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02048208677542421,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02048208677542421
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909039,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909039
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7871794871794872,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.7871794871794872,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5298013245033113,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.5298013245033113,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862086,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862086
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.02039785396942699,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.02039785396942699
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095672,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095672
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8957055214723927,
"acc_stderr": 0.02401351731943907,
"acc_norm": 0.8957055214723927,
"acc_norm_stderr": 0.02401351731943907
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194166,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8952745849297573,
"acc_stderr": 0.010949664098633361,
"acc_norm": 0.8952745849297573,
"acc_norm_stderr": 0.010949664098633361
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.020903975842083027,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.020903975842083027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6067039106145251,
"acc_stderr": 0.01633726869427011,
"acc_norm": 0.6067039106145251,
"acc_norm_stderr": 0.01633726869427011
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213516,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.02167005888551079,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.02167005888551079
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.02866382014719949,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.02866382014719949
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5977835723598436,
"acc_stderr": 0.012523646856180178,
"acc_norm": 0.5977835723598436,
"acc_norm_stderr": 0.012523646856180178
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541087,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541087
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262552,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262552
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.023420972069166344,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.023420972069166344
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.5802743849781816,
"mc2_stderr": 0.014915488598308874
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363696
},
"harness|gsm8k|5": {
"acc": 0.601213040181956,
"acc_stderr": 0.013487360477060837
}
}
```
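
As a small illustration (assuming a variable `results` holds exactly the dictionary printed above, e.g. obtained with `json.load()` on the linked results file), the per-subject MMLU scores can be averaged like this:

```python
# `results` is assumed to be the dictionary shown above.
mmlu_tasks = {
    name: scores
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Average the normalized accuracy over all MMLU subjects in the run.
mmlu_acc_norm = sum(s["acc_norm"] for s in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU subjects: {len(mmlu_tasks)}, mean acc_norm: {mmlu_acc_norm:.4f}")
```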
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2 | [
"region:us"
] | 2023-12-23T18:34:59+00:00 | {"pretty_name": "Evaluation run of APMIC/caigun-lora-model-34B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-34B-v2](https://huggingface.co/APMIC/caigun-lora-model-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:32:48.533990](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-34B-v2/blob/main/results_2023-12-23T18-32-48.533990.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7514714868021417,\n \"acc_stderr\": 0.028416591943209543,\n \"acc_norm\": 0.7567219646228759,\n \"acc_norm_stderr\": 0.028946697394984712,\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5802743849781816,\n \"mc2_stderr\": 0.014915488598308874\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414044,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158294\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n \"acc_stderr\": 0.004750884401095161,\n \"acc_norm\": 0.8528181637124079,\n \"acc_norm_stderr\": 0.0035356302890914605\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.03885004245800253,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.03885004245800253\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474924,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100813,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100813\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889778,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889778\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6216931216931217,\n \"acc_stderr\": 0.02497695405315524,\n \"acc_norm\": 0.6216931216931217,\n \"acc_norm_stderr\": 0.02497695405315524\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657244,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657244\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969567,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656194,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656194\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02048208677542421,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02048208677542421\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909039,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909039\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7871794871794872,\n \"acc_stderr\": 
0.020752423722128013,\n \"acc_norm\": 0.7871794871794872,\n \"acc_norm_stderr\": 0.020752423722128013\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862086,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862086\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.02039785396942699,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.02039785396942699\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095672,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095672\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8957055214723927,\n \"acc_stderr\": 0.02401351731943907,\n \"acc_norm\": 0.8957055214723927,\n \"acc_norm_stderr\": 0.02401351731943907\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.01872430174194166,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.01872430174194166\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8952745849297573,\n \"acc_stderr\": 0.010949664098633361,\n \"acc_norm\": 0.8952745849297573,\n \"acc_norm_stderr\": 
0.010949664098633361\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.020903975842083027,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.020903975842083027\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6067039106145251,\n \"acc_stderr\": 0.01633726869427011,\n \"acc_norm\": 0.6067039106145251,\n \"acc_norm_stderr\": 0.01633726869427011\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213516,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.02167005888551079,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.02167005888551079\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.02866382014719949,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.02866382014719949\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5977835723598436,\n \"acc_stderr\": 0.012523646856180178,\n \"acc_norm\": 0.5977835723598436,\n \"acc_norm_stderr\": 0.012523646856180178\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541087,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541087\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262552,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262552\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166344,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166344\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.5802743849781816,\n \"mc2_stderr\": 0.014915488598308874\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363696\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.601213040181956,\n \"acc_stderr\": 0.013487360477060837\n }\n}\n```", "repo_url": "https://huggingface.co/APMIC/caigun-lora-model-34B-v2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-32-48.533990.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-32-48.533990.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-32-48.533990.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-32-48.533990.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-32-48.533990.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-32-48.533990.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["**/details_harness|winogrande|5_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-32-48.533990.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T18_32_48.533990", "path": ["results_2023-12-23T18-32-48.533990.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T18-32-48.533990.parquet"]}]}]} | 2023-12-23T18:35:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v2
Dataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T18:32:48.533990 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v2\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:32:48.533990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v2\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:32:48.533990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of APMIC/caigun-lora-model-34B-v2\n\n\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-34B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:32:48.533990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
5e75f8c390ab6cc84b9a0268d31cc61b4f8f68d4 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE_13B](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B",
"harness_winogrande_5",
split="train")
```
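
The repository exposes one configuration per evaluated task, plus the aggregated `results` configuration. If you want to enumerate them programmatically before loading, a minimal sketch with the `datasets` library (using the same repository id as in the example above) is:

```python
from datasets import get_dataset_config_names

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B"
)
print(len(configs), "configurations found")
print(configs[:5])  # preview a few configuration names
```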
## Latest results
These are the [latest results from run 2023-12-23T18:38:27.447286](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B/blob/main/results_2023-12-23T18-38-27.447286.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6223215644024702,
"acc_stderr": 0.03272119140672753,
"acc_norm": 0.6268443929180944,
"acc_norm_stderr": 0.03337594702337273,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5754893244644783,
"mc2_stderr": 0.015550219619052192
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600933
},
"harness|hellaswag|10": {
"acc": 0.6388169687313284,
"acc_stderr": 0.0047936178356450705,
"acc_norm": 0.8391754630551683,
"acc_norm_stderr": 0.0036661823284423437
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432108,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432108
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612893,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647897,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277738,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806318,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5754893244644783,
"mc2_stderr": 0.015550219619052192
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643409
},
"harness|gsm8k|5": {
"acc": 0.44351781652767247,
"acc_stderr": 0.013684327592606165
}
}
```
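
If you only need these aggregated numbers rather than the per-sample details, you can load the `results` configuration directly. The sketch below assumes the `latest` split declared in this repository's configuration and simply prints the first row; the exact column layout of that flattened row may vary between harness versions.

```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B",
    "results",
    split="latest",
)

# The aggregated metrics are stored as a single flattened row.
print(results[0])
```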
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B | [
"region:us"
] | 2023-12-23T18:40:44+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE_13B](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T18:38:27.447286](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B/blob/main/results_2023-12-23T18-38-27.447286.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6223215644024702,\n \"acc_stderr\": 0.03272119140672753,\n \"acc_norm\": 0.6268443929180944,\n \"acc_norm_stderr\": 0.03337594702337273,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5754893244644783,\n \"mc2_stderr\": 0.015550219619052192\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600933\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6388169687313284,\n \"acc_stderr\": 0.0047936178356450705,\n \"acc_norm\": 0.8391754630551683,\n \"acc_norm_stderr\": 0.0036661823284423437\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432108,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432108\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n 
\"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806318,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806318\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5754893244644783,\n \"mc2_stderr\": 0.015550219619052192\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643409\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44351781652767247,\n \"acc_stderr\": 0.013684327592606165\n }\n}\n```", "repo_url": 
"https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-38-27.447286.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-38-27.447286.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-38-27.447286.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T18-38-27.447286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-38-27.447286.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-38-27.447286.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["**/details_harness|winogrande|5_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T18-38-27.447286.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T18_38_27.447286", "path": ["results_2023-12-23T18-38-27.447286.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T18-38-27.447286.parquet"]}]}]} | 2023-12-23T18:41:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B
Dataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
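A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention (the config and split names are taken from this card's metadata listing):

```python
from datasets import load_dataset

# Per-sample details for one task configuration.
# The repository name is assumed from the "details_<org>__<model>"
# convention; "harness_winogrande_5" and "train" come from the
# configuration list in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B",
    "harness_winogrande_5",
    split="train",
)
```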
## Latest results
These are the latest results from run 2023-12-23T18:38:27.447286 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
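The aggregated numbers for that run live in the "results" configuration; a hedged sketch of loading them (config and split names are taken from this card's metadata listing):

```python
from datasets import load_dataset

# "results" aggregates all task metrics for the run; the "latest"
# split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_13B",
    "results",
    split="latest",
)
```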
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:38:27.447286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T18:38:27.447286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_13B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T18:38:27.447286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
d7808aa432720d62342f4cbd37cca7072d567f15 |
# Dataset Card for Evaluation run of KaeriJenti/Kaori-34B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KaeriJenti/Kaori-34B-v1](https://huggingface.co/KaeriJenti/Kaori-34B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1",
"harness_winogrande_5",
split="train")
```
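
The same call works for any of the 63 configurations; for example, the aggregated metrics for this model can be pulled from the "results" configuration (a sketch using the config and split names declared in this card's metadata):

```python
from datasets import load_dataset

# "latest" resolves to the most recent run; a timestamped split
# (e.g. "2023_12_24T17_12_07.814018") pins a specific evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1",
    "results",
    split="latest",
)
```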
## Latest results
These are the [latest results from run 2023-12-24T17:12:07.814018](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1/blob/main/results_2023-12-24T17-12-07.814018.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6944862394705882,
"acc_stderr": 0.030633297916500872,
"acc_norm": 0.7024973870801288,
"acc_norm_stderr": 0.031227562224760336,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5314232366697602,
"mc2_stderr": 0.01519863786002632
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251102,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094089
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.004876028037941935,
"acc_norm": 0.7964548894642501,
"acc_norm_stderr": 0.004018115765954251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.03001755447188056,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.03001755447188056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.02563425811555495,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.02563425811555495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321586,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194208,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607565,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857737,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857737
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.02543511943810536,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.02543511943810536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240634,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240634
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018516,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018516
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.012703598899445166,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.012703598899445166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.664804469273743,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.664804469273743,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.02451387997362197,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.02451387997362197
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.599290780141844,
"acc_stderr": 0.029233465745573096,
"acc_norm": 0.599290780141844,
"acc_norm_stderr": 0.029233465745573096
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5280312907431551,
"acc_stderr": 0.012750151802922445,
"acc_norm": 0.5280312907431551,
"acc_norm_stderr": 0.012750151802922445
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740536,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740536
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663133,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663133
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291275,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291275
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5314232366697602,
"mc2_stderr": 0.01519863786002632
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836673
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712218
}
}
```
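
Each per-task entry above follows the same `acc`/`acc_stderr` (and `acc_norm`) layout, so headline numbers can be recomputed directly from this blob. A small sketch, assuming the JSON above has already been parsed into a Python dict called `results`:

```python
# `results` is assumed to hold the dict printed above.
mmlu_accs = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")  # one entry per MMLU sub-task
}
mmlu_average = sum(mmlu_accs.values()) / len(mmlu_accs)

arc_norm = results["harness|arc:challenge|25"]["acc_norm"]
gsm8k_acc = results["harness|gsm8k|5"]["acc"]
print(f"MMLU avg: {mmlu_average:.4f}  ARC-c acc_norm: {arc_norm:.4f}  GSM8K acc: {gsm8k_acc:.4f}")
```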
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1 | [
"region:us"
] | 2023-12-23T19:12:06+00:00 | {"pretty_name": "Evaluation run of KaeriJenti/Kaori-34B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [KaeriJenti/Kaori-34B-v1](https://huggingface.co/KaeriJenti/Kaori-34B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T17:12:07.814018](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1/blob/main/results_2023-12-24T17-12-07.814018.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6944862394705882,\n \"acc_stderr\": 0.030633297916500872,\n \"acc_norm\": 0.7024973870801288,\n \"acc_norm_stderr\": 0.031227562224760336,\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5314232366697602,\n \"mc2_stderr\": 0.01519863786002632\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251102,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094089\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n \"acc_stderr\": 0.004876028037941935,\n \"acc_norm\": 0.7964548894642501,\n \"acc_norm_stderr\": 0.004018115765954251\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123387,\n \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.03001755447188056,\n \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.03001755447188056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.03780019230438014,\n \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.03780019230438014\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.02563425811555495,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.02563425811555495\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.021417242936321586,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.021417242936321586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607565,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607565\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n 
\"acc_stderr\": 0.023060438380857737,\n \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857737\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.02543511943810536,\n \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.02543511943810536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240634,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240634\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018516,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018516\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.012703598899445166,\n \"acc_norm\": 
0.8518518518518519,\n \"acc_norm_stderr\": 0.012703598899445166\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.664804469273743,\n \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.664804469273743,\n \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n \"acc_stderr\": 0.02451387997362197,\n \"acc_norm\": 0.752411575562701,\n \"acc_norm_stderr\": 0.02451387997362197\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.599290780141844,\n \"acc_stderr\": 0.029233465745573096,\n \"acc_norm\": 0.599290780141844,\n \"acc_norm_stderr\": 0.029233465745573096\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5280312907431551,\n \"acc_stderr\": 0.012750151802922445,\n \"acc_norm\": 0.5280312907431551,\n \"acc_norm_stderr\": 0.012750151802922445\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740536,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740536\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663133,\n \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663133\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291275,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291275\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5314232366697602,\n \"mc2_stderr\": 0.01519863786002632\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836673\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36694465504169826,\n \"acc_stderr\": 0.013275883047712218\n }\n}\n```", "repo_url": "https://huggingface.co/KaeriJenti/Kaori-34B-v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|arc:challenge|25_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|gsm8k|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hellaswag|10_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-09-53.896573.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T19-09-53.896573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-12-07.814018.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-12-07.814018.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-12-07.814018.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-12-07.814018.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-09-53.896573.parquet"]}, 
{"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["**/details_harness|winogrande|5_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": ["**/details_harness|winogrande|5_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T17-12-07.814018.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T19_09_53.896573", "path": ["results_2023-12-23T19-09-53.896573.parquet"]}, {"split": "2023_12_24T17_12_07.814018", "path": 
["results_2023-12-24T17-12-07.814018.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T17-12-07.814018.parquet"]}]}]} | 2023-12-24T17:14:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KaeriJenti/Kaori-34B-v1
Dataset automatically created during the evaluation run of model KaeriJenti/Kaori-34B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
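A minimal sketch of that call, assuming the `datasets` library is installed and that this card's dataset follows the usual `open-llm-leaderboard/details_<org>__<model>` naming (the repository name below is inferred from the model name on this card):

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details from the latest run of this evaluation
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34B-v1",
	"harness_winogrande_5",
	split="train")
```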
## Latest results
These are the latest results from run 2023-12-24T17:12:07.814018 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KaeriJenti/Kaori-34B-v1\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:12:07.814018(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KaeriJenti/Kaori-34B-v1\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:12:07.814018(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KaeriJenti/Kaori-34B-v1\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T17:12:07.814018(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
51bb508ff202e239c680212edc166cedd0c93e14 |
# Dataset Card for Evaluation run of KaeriJenti/Kaori-34b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KaeriJenti/Kaori-34b-v2](https://huggingface.co/KaeriJenti/Kaori-34b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2",
"harness_winogrande_5",
split="train")
```
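
The same pattern works for any other configuration in this repository. As a minimal sketch (the configuration names below are taken from this card's own metadata: the aggregated `results` config and one MMLU subtask config), you could also do:

```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split always points to the most recent results
results = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2",
                       "results",
                       split="latest")

# Per-sample details for a single MMLU subtask (5-shot abstract algebra)
abstract_algebra = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2",
                                "harness_hendrycksTest_abstract_algebra_5",
                                split="latest")
```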
## Latest results
These are the [latest results from run 2023-12-23T19:17:38.902154](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2/blob/main/results_2023-12-23T19-17-38.902154.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2562435688049368,
"acc_stderr": 0.03087677995486888,
"acc_norm": 0.25622099120034325,
"acc_norm_stderr": 0.03166775316506421,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.49462441219025927,
"mc2_stderr": 0.016011015086112988
},
"harness|arc:challenge|25": {
"acc": 0.189419795221843,
"acc_stderr": 0.011450705115910769,
"acc_norm": 0.23890784982935154,
"acc_norm_stderr": 0.012461071376316614
},
"harness|hellaswag|10": {
"acc": 0.27394941246763593,
"acc_stderr": 0.004450718673552667,
"acc_norm": 0.2896833300139414,
"acc_norm_stderr": 0.004526883021027624
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108594,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108594
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304136,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.027986724666736205,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.027986724666736205
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.032876667586034886,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.032876667586034886
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193339,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193339
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128016,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128016
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0291575221846056,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0291575221846056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.027123298205229972,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.027123298205229972
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910894,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910894
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761973,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761973
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888496,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.02533684856333236,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.02533684856333236
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594722,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944966,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944966
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.49462441219025927,
"mc2_stderr": 0.016011015086112988
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839957
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544905
}
}
```
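
If you prefer to work with the raw results file linked above rather than the `datasets` configurations, a minimal sketch using the `huggingface_hub` client (assuming it is installed; the filename is the one referenced in the link above) is:

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2",
    filename="results_2023-12-23T19-17-38.902154.json",
    repo_type="dataset",
)

with open(path) as f:
    raw_results = json.load(f)

# Inspect the top-level structure of the raw file before drilling into task scores
print(sorted(raw_results.keys()))
```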
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2 | [
"region:us"
] | 2023-12-23T19:19:51+00:00 | {"pretty_name": "Evaluation run of KaeriJenti/Kaori-34b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [KaeriJenti/Kaori-34b-v2](https://huggingface.co/KaeriJenti/Kaori-34b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T19:17:38.902154](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2/blob/main/results_2023-12-23T19-17-38.902154.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2562435688049368,\n \"acc_stderr\": 0.03087677995486888,\n \"acc_norm\": 0.25622099120034325,\n \"acc_norm_stderr\": 0.03166775316506421,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.49462441219025927,\n \"mc2_stderr\": 0.016011015086112988\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.189419795221843,\n \"acc_stderr\": 0.011450705115910769,\n \"acc_norm\": 0.23890784982935154,\n \"acc_norm_stderr\": 0.012461071376316614\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27394941246763593,\n \"acc_stderr\": 0.004450718673552667,\n \"acc_norm\": 0.2896833300139414,\n \"acc_norm_stderr\": 0.004526883021027624\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436025,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436025\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108594,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108594\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304136,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304136\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n 
\"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.027986724666736205,\n \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.027986724666736205\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.032876667586034886,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.032876667586034886\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193339,\n \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193339\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128016,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128016\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0291575221846056,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0291575221846056\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22362869198312235,\n \"acc_stderr\": 0.027123298205229972,\n \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.027123298205229972\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n \"acc_stderr\": 0.028930413120910894,\n \"acc_norm\": 0.24663677130044842,\n \"acc_norm_stderr\": 0.028930413120910894\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.19230769230769232,\n \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27330779054916987,\n \"acc_stderr\": 
0.015936681062628556,\n \"acc_norm\": 0.27330779054916987,\n \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761973,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761973\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n \"acc_stderr\": 0.010792595553888496,\n \"acc_norm\": 0.23272490221642764,\n \"acc_norm_stderr\": 0.010792595553888496\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.02533684856333236,\n \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.02533684856333236\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594722,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594722\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.1890547263681592,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944966,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944966\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.49462441219025927,\n \"mc2_stderr\": 0.016011015086112988\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839957\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544905\n }\n}\n```", 
"repo_url": "https://huggingface.co/KaeriJenti/Kaori-34b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|arc:challenge|25_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|gsm8k|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hellaswag|10_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-17-38.902154.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-17-38.902154.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-17-38.902154.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T19-17-38.902154.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-17-38.902154.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T19-17-38.902154.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["**/details_harness|winogrande|5_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T19-17-38.902154.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T19_17_38.902154", "path": ["results_2023-12-23T19-17-38.902154.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T19-17-38.902154.parquet"]}]}]} | 2023-12-23T19:20:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KaeriJenti/Kaori-34b-v2
Dataset automatically created during the evaluation run of model KaeriJenti/Kaori-34b-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
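For example (a minimal sketch — the repository id below is an assumption, following the standard `details_<org>__<model>` naming used by the Open LLM Leaderboard detail datasets):
```python
from datasets import load_dataset

# Assumed repository id, inferred from the details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__Kaori-34b-v2",
	"harness_winogrande_5",
	split="train")
```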
## Latest results
These are the latest results from run 2023-12-23T19:17:38.902154 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KaeriJenti/Kaori-34b-v2\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T19:17:38.902154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KaeriJenti/Kaori-34b-v2\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T19:17:38.902154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KaeriJenti/Kaori-34b-v2\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/Kaori-34b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T19:17:38.902154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
cdab6ae2466084c332b3269900f317b21284fd62 | # Dataset Card for "calibrated_3channel_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gdurkin/calibrated_3channel_train | [
"region:us"
] | 2023-12-23T19:30:57+00:00 | {"dataset_info": {"features": [{"name": "label", "dtype": "image"}, {"name": "pixel_values", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 457793398.02, "num_examples": 1873}], "download_size": 456301474, "dataset_size": 457793398.02}} | 2023-12-27T04:36:47+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "calibrated_3channel_train"
More Information needed | [
"# Dataset Card for \"calibrated_3channel_train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"calibrated_3channel_train\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"calibrated_3channel_train\"\n\nMore Information needed"
] |
f5fbddc81aaf5a04edd7008cfb87729d6b8f9eb2 | # Dataset Card for "calibrated_3channel_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gdurkin/calibrated_3channel_test | [
"region:us"
] | 2023-12-23T19:31:02+00:00 | {"dataset_info": {"features": [{"name": "label", "dtype": "image"}, {"name": "pixel_values", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 164844271.0, "num_examples": 671}], "download_size": 164212556, "dataset_size": 164844271.0}} | 2023-12-27T04:36:56+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "calibrated_3channel_test"
More Information needed | [
"# Dataset Card for \"calibrated_3channel_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"calibrated_3channel_test\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"calibrated_3channel_test\"\n\nMore Information needed"
] |
2ef12699e3059924b6d940d1517511d62b0d10f7 |
# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-mixtral-8x7b-v1](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1",
"harness_winogrande_5",
split="train")
```
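Similarly, the aggregated metrics described above can be pulled from the "results" configuration; a sketch, assuming the "latest" split name defined for these detail datasets:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; "latest" points at the most recent run
results = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1",
	"results",
	split="latest")
```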
## Latest results
These are the [latest results from run 2023-12-23T20:18:39.786193](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1/blob/main/results_2023-12-23T20-18-39.786193.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7128522987523449,
"acc_stderr": 0.030245263140979715,
"acc_norm": 0.7167785241964734,
"acc_norm_stderr": 0.03083288189023405,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187222,
"mc2": 0.553071814559571,
"mc2_stderr": 0.0151346546936277
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620448,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173306
},
"harness|hellaswag|10": {
"acc": 0.6661023700458076,
"acc_stderr": 0.0047063982523824635,
"acc_norm": 0.8575980880302728,
"acc_norm_stderr": 0.0034874768122805247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977109,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977109
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.029773847012532967,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.029773847012532967
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289694,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471429,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471429
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631002,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631002
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813236,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813236
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8837803320561941,
"acc_stderr": 0.011460632981922894,
"acc_norm": 0.8837803320561941,
"acc_norm_stderr": 0.011460632981922894
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.0165638293990477,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.0165638293990477
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02273378940544759,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02273378940544759
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.029752389657427054,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.029752389657427054
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529986962190352,
"acc_stderr": 0.012747248967079045,
"acc_norm": 0.529986962190352,
"acc_norm_stderr": 0.012747248967079045
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.01699272346546623,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.01699272346546623
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101713,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101713
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187222,
"mc2": 0.553071814559571,
"mc2_stderr": 0.0151346546936277
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.5928733889310084,
"acc_stderr": 0.013532811069356528
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1 | [
"region:us"
] | 2023-12-23T20:20:57+00:00 | {"pretty_name": "Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [YeungNLP/firefly-mixtral-8x7b-v1](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T20:18:39.786193](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v1/blob/main/results_2023-12-23T20-18-39.786193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7128522987523449,\n \"acc_stderr\": 0.030245263140979715,\n \"acc_norm\": 0.7167785241964734,\n \"acc_norm_stderr\": 0.03083288189023405,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.017168830935187222,\n \"mc2\": 0.553071814559571,\n \"mc2_stderr\": 0.0151346546936277\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620448,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173306\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6661023700458076,\n \"acc_stderr\": 0.0047063982523824635,\n \"acc_norm\": 0.8575980880302728,\n \"acc_norm_stderr\": 0.0034874768122805247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977109,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977109\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380042,\n \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380042\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857744,\n \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857744\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.029773847012532967,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.029773847012532967\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275886,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289694,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.03275773486100999,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.03275773486100999\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471429,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471429\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631002,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631002\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813236,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813236\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8837803320561941,\n \"acc_stderr\": 0.011460632981922894,\n \"acc_norm\": 0.8837803320561941,\n \"acc_norm_stderr\": 0.011460632981922894\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.0165638293990477,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.0165638293990477\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02273378940544759,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02273378940544759\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5354609929078015,\n \"acc_stderr\": 0.029752389657427054,\n \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.029752389657427054\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n \"acc_stderr\": 0.012747248967079045,\n \"acc_norm\": 0.529986962190352,\n \"acc_norm_stderr\": 0.012747248967079045\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.01699272346546623,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.01699272346546623\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101713,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101713\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.017168830935187222,\n \"mc2\": 0.553071814559571,\n \"mc2_stderr\": 0.0151346546936277\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5928733889310084,\n \"acc_stderr\": 0.013532811069356528\n 
}\n}\n```", "repo_url": "https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-39.786193.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-39.786193.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-39.786193.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-39.786193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-39.786193.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T20_18_39.786193", "path": ["**/details_harness|winogrande|5_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T20-18-39.786193.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T20_18_39.786193", "path": ["results_2023-12-23T20-18-39.786193.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T20-18-39.786193.parquet"]}]}]} | 2023-12-23T20:21:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1
Dataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T20:18:39.786193 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:39.786193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:39.786193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:39.786193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
320f6153a36ad49e8e9537f97fbf16d66e366ce1 |
# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-34b-v3](https://huggingface.co/KaeriJenti/kaori-34b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3",
"harness_winogrande_5",
split="train")
```
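
The aggregated metrics can be loaded the same way through the "results" configuration mentioned above. A minimal sketch, assuming the "latest" split naming used by the per-task configurations:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; swap "latest" for a timestamped
# split name if you need one specific evaluation run instead.
results = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3",
                       "results",
                       split="latest")
print(results[0])  # a single row holding the aggregated metrics of the run
```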
## Latest results
These are the [latest results from run 2023-12-23T20:18:59.235180](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3/blob/main/results_2023-12-23T20-18-59.235180.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6942406707070711,
"acc_stderr": 0.030569400022157234,
"acc_norm": 0.702379645260744,
"acc_norm_stderr": 0.031160918444313776,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5236714287384441,
"mc2_stderr": 0.015221488743934338
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.6059549890460068,
"acc_stderr": 0.004876459434619799,
"acc_norm": 0.795857398924517,
"acc_norm_stderr": 0.004022499210760733
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.02563425811555495,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.02563425811555495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775614,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371318,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371318
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194208,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607565,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611759,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611759
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194165,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147297,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6424581005586593,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.6424581005586593,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6028368794326241,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.6028368794326241,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5247718383311604,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.5247718383311604,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114944,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114944
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7271241830065359,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.7271241830065359,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5236714287384441,
"mc2_stderr": 0.015221488743934338
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650875
},
"harness|gsm8k|5": {
"acc": 0.36239575435936316,
"acc_stderr": 0.013240654263574776
}
}
```
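
As a rough illustration (not part of the generated results), the per-task entries above can be flattened into a small table for side-by-side comparison. The values below are copied from the metrics shown; in practice you would parse the full results_*.json file linked in this card instead of pasting them by hand:

```python
import pandas as pd

# A few entries copied from the results above; keys follow the
# "harness|<task>|<n_shots>" naming used throughout this card.
metrics = {
    "harness|arc:challenge|25": {"acc": 0.613481228668942, "acc_norm": 0.6424914675767918},
    "harness|hellaswag|10": {"acc": 0.6059549890460068, "acc_norm": 0.795857398924517},
    "harness|winogrande|5": {"acc": 0.7647987371744278},
    "harness|gsm8k|5": {"acc": 0.36239575435936316},
}

rows = [{"task": name, **values} for name, values in metrics.items()]
df = pd.DataFrame(rows).set_index("task")
print(df[["acc", "acc_norm"]])  # acc_norm is NaN where the harness does not report it
```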
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3 | [
"region:us"
] | 2023-12-23T20:21:10+00:00 | {"pretty_name": "Evaluation run of KaeriJenti/kaori-34b-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-34b-v3](https://huggingface.co/KaeriJenti/kaori-34b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T20:18:59.235180](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3/blob/main/results_2023-12-23T20-18-59.235180.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6942406707070711,\n \"acc_stderr\": 0.030569400022157234,\n \"acc_norm\": 0.702379645260744,\n \"acc_norm_stderr\": 0.031160918444313776,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5236714287384441,\n \"mc2_stderr\": 0.015221488743934338\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910471,\n \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6059549890460068,\n \"acc_stderr\": 0.004876459434619799,\n \"acc_norm\": 0.795857398924517,\n \"acc_norm_stderr\": 0.004022499210760733\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653696,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653696\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.02563425811555495,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.02563425811555495\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n \"acc_stderr\": 0.021255464065371318,\n \"acc_norm\": 0.832258064516129,\n \"acc_norm_stderr\": 0.021255464065371318\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607565,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607565\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7128205128205128,\n 
\"acc_stderr\": 0.022939925418530616,\n \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611759,\n \"acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611759\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.01872430174194165,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.01872430174194165\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n \"acc_stderr\": 0.012658201736147297,\n \"acc_norm\": 
0.8531289910600255,\n \"acc_norm_stderr\": 0.012658201736147297\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.6424581005586593,\n \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6028368794326241,\n \"acc_stderr\": 0.0291898056735871,\n \"acc_norm\": 0.6028368794326241,\n \"acc_norm_stderr\": 0.0291898056735871\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5247718383311604,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.5247718383311604,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114944,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114944\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7271241830065359,\n \"acc_stderr\": 0.01802047414839358,\n \"acc_norm\": 0.7271241830065359,\n \"acc_norm_stderr\": 0.01802047414839358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249765,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249765\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5236714287384441,\n \"mc2_stderr\": 0.015221488743934338\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650875\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36239575435936316,\n \"acc_stderr\": 0.013240654263574776\n }\n}\n```", "repo_url": "https://huggingface.co/KaeriJenti/kaori-34b-v3", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-59.235180.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-59.235180.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-59.235180.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-59.235180.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-59.235180.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-18-59.235180.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["**/details_harness|winogrande|5_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T20-18-59.235180.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T20_18_59.235180", "path": ["results_2023-12-23T20-18-59.235180.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T20-18-59.235180.parquet"]}]}]} | 2023-12-23T20:21:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v3
Dataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
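A minimal sketch, assuming the dataset id and configuration name follow the same naming pattern as the other runs in this collection:

```python
from datasets import load_dataset

# Assumed dataset id and config name; adjust if the repository layout differs.
data = load_dataset(
    "open-llm-leaderboard/details_KaeriJenti__kaori-34b-v3",
    "harness_winogrande_5",
    split="train",
)
```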
## Latest results
These are the latest results from run 2023-12-23T20:18:59.235180 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v3\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:59.235180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v3\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:59.235180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v3\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T20:18:59.235180(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7b8676e6d8e7f61352bc25fabdb7b9ed1a8fb206 |
# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-34b-v4](https://huggingface.co/KaeriJenti/kaori-34b-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-23T20:41:54.627172](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4/blob/main/results_2023-12-23T20-41-54.627172.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2562435688049368,
"acc_stderr": 0.03087677995486888,
"acc_norm": 0.25622099120034325,
"acc_norm_stderr": 0.03166775316506421,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.49462441219025927,
"mc2_stderr": 0.016011015086112988
},
"harness|arc:challenge|25": {
"acc": 0.189419795221843,
"acc_stderr": 0.011450705115910769,
"acc_norm": 0.23890784982935154,
"acc_norm_stderr": 0.012461071376316614
},
"harness|hellaswag|10": {
"acc": 0.27394941246763593,
"acc_stderr": 0.004450718673552667,
"acc_norm": 0.2896833300139414,
"acc_norm_stderr": 0.004526883021027624
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108594,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108594
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304136,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.027986724666736205,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.027986724666736205
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.032876667586034886,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.032876667586034886
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193339,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193339
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128016,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128016
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0291575221846056,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0291575221846056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.027123298205229972,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.027123298205229972
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910894,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910894
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761973,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761973
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888496,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.02533684856333236,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.02533684856333236
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594722,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944966,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944966
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502346,
"mc2": 0.49462441219025927,
"mc2_stderr": 0.016011015086112988
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839957
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544905
}
}
```
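The same `load_dataset` pattern shown above can also pull these aggregated numbers programmatically. A minimal sketch, assuming the "results" configuration and "latest" split follow the conventions described earlier:

```python
from datasets import load_dataset

# Load the aggregated results for this run; the config and split names are
# assumed from the conventions described above ("results" config, "latest" split).
results = load_dataset(
    "open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4",
    "results",
    split="latest",
)

# Inspect which aggregated columns are available.
print(results.column_names)
```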
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4 | [
"region:us"
] | 2023-12-23T20:44:06+00:00 | {"pretty_name": "Evaluation run of KaeriJenti/kaori-34b-v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-34b-v4](https://huggingface.co/KaeriJenti/kaori-34b-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T20:41:54.627172](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4/blob/main/results_2023-12-23T20-41-54.627172.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2562435688049368,\n \"acc_stderr\": 0.03087677995486888,\n \"acc_norm\": 0.25622099120034325,\n \"acc_norm_stderr\": 0.03166775316506421,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.49462441219025927,\n \"mc2_stderr\": 0.016011015086112988\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.189419795221843,\n \"acc_stderr\": 0.011450705115910769,\n \"acc_norm\": 0.23890784982935154,\n \"acc_norm_stderr\": 0.012461071376316614\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27394941246763593,\n \"acc_stderr\": 0.004450718673552667,\n \"acc_norm\": 0.2896833300139414,\n \"acc_norm_stderr\": 0.004526883021027624\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436025,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436025\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108594,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108594\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304136,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304136\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n 
\"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.027986724666736205,\n \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.027986724666736205\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.032876667586034886,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.032876667586034886\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193339,\n \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193339\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128016,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128016\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0291575221846056,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0291575221846056\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22362869198312235,\n \"acc_stderr\": 0.027123298205229972,\n \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.027123298205229972\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n \"acc_stderr\": 0.028930413120910894,\n \"acc_norm\": 0.24663677130044842,\n \"acc_norm_stderr\": 0.028930413120910894\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.19230769230769232,\n \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27330779054916987,\n \"acc_stderr\": 
0.015936681062628556,\n \"acc_norm\": 0.27330779054916987,\n \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761973,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761973\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n \"acc_stderr\": 0.010792595553888496,\n \"acc_norm\": 0.23272490221642764,\n \"acc_norm_stderr\": 0.010792595553888496\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.02533684856333236,\n \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.02533684856333236\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594722,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594722\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.1890547263681592,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944966,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944966\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.49462441219025927,\n \"mc2_stderr\": 0.016011015086112988\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839957\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544905\n }\n}\n```", 
"repo_url": "https://huggingface.co/KaeriJenti/kaori-34b-v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-41-54.627172.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-41-54.627172.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-41-54.627172.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T20-41-54.627172.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-41-54.627172.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T20-41-54.627172.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["**/details_harness|winogrande|5_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T20-41-54.627172.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T20_41_54.627172", "path": ["results_2023-12-23T20-41-54.627172.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T20-41-54.627172.parquet"]}]}]} | 2023-12-23T20:44:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v4
Dataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
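For instance (the repository name and the task configuration below are taken directly from this card's metadata; any of the other listed task configurations can be substituted):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this run.
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-34b-v4",
                    "harness_winogrande_5",
                    split="train")
```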
## Latest results
These are the latest results from run 2023-12-23T20:41:54.627172 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v4\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:41:54.627172(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v4\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T20:41:54.627172(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KaeriJenti/kaori-34b-v4\n\n\n\nDataset automatically created during the evaluation run of model KaeriJenti/kaori-34b-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T20:41:54.627172(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
a3da7039ca269e7b04baf10b2b2a221380d5c004 |
# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-mixtral-8x7b-v0.1](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1",
"harness_winogrande_5",
split="train")
```
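As a usage note, the aggregated metrics can be pulled the same way. This is a minimal sketch, assuming this dataset follows the standard "results" configuration and "latest" split used by these leaderboard detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run (assumes the standard
# "results" configuration and "latest" split of these detail datasets).
results = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1",
                       "results",
                       split="latest")
print(results[0])  # one row per run, holding the aggregated scores
```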
## Latest results
These are the [latest results from run 2023-12-23T21:17:56.230718](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1/blob/main/results_2023-12-23T21-17-56.230718.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7128522987523449,
"acc_stderr": 0.030245263140979715,
"acc_norm": 0.7167785241964734,
"acc_norm_stderr": 0.03083288189023405,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187222,
"mc2": 0.553071814559571,
"mc2_stderr": 0.0151346546936277
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620448,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173306
},
"harness|hellaswag|10": {
"acc": 0.6661023700458076,
"acc_stderr": 0.0047063982523824635,
"acc_norm": 0.8575980880302728,
"acc_norm_stderr": 0.0034874768122805247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977109,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977109
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.029773847012532967,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.029773847012532967
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289694,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471429,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471429
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631002,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631002
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813236,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813236
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8837803320561941,
"acc_stderr": 0.011460632981922894,
"acc_norm": 0.8837803320561941,
"acc_norm_stderr": 0.011460632981922894
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.0165638293990477,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.0165638293990477
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02273378940544759,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02273378940544759
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.029752389657427054,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.029752389657427054
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529986962190352,
"acc_stderr": 0.012747248967079045,
"acc_norm": 0.529986962190352,
"acc_norm_stderr": 0.012747248967079045
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.01699272346546623,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.01699272346546623
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101713,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101713
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187222,
"mc2": 0.553071814559571,
"mc2_stderr": 0.0151346546936277
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.5928733889310084,
"acc_stderr": 0.013532811069356528
}
}
```
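For quick programmatic access to the aggregated numbers shown above, here is a minimal sketch using the same `load_dataset` pattern as for the per-task details. It assumes the `"results"` configuration and the `"latest"` split naming described earlier for this dataset; the printed fields are simply whatever aggregated metrics were stored for the run.

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
# The "results" configuration and the "latest" split follow the naming
# convention described above for this leaderboard details dataset.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1",
    "results",
    split="latest",
)

print(results.column_names)  # whatever aggregated fields were stored for the run
print(results[0])            # first (and typically only) row: the latest run
```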
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
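Until the field-level description is filled in, the structure can be explored directly from one of the per-task configurations. The sketch below uses `"harness_gsm8k_5"`, one of the configurations listed for this dataset; the exact columns it prints are whatever the evaluation harness logged and may differ between tasks.

```python
from datasets import load_dataset

# Inspect the schema of a single per-task detail configuration.
# "harness_gsm8k_5" is taken from this dataset's configuration list;
# column names are not guaranteed to be identical across tasks.
details = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1",
    "harness_gsm8k_5",
    split="latest",
)

print(details.column_names)  # fields logged by the harness for each example
print(details[0])            # first evaluated example with its per-sample metrics
```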
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1 | [
"region:us"
] | 2023-12-23T21:20:11+00:00 | {"pretty_name": "Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [YeungNLP/firefly-mixtral-8x7b-v0.1](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T21:17:56.230718](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-mixtral-8x7b-v0.1/blob/main/results_2023-12-23T21-17-56.230718.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7128522987523449,\n \"acc_stderr\": 0.030245263140979715,\n \"acc_norm\": 0.7167785241964734,\n \"acc_norm_stderr\": 0.03083288189023405,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.017168830935187222,\n \"mc2\": 0.553071814559571,\n \"mc2_stderr\": 0.0151346546936277\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620448,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173306\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6661023700458076,\n \"acc_stderr\": 0.0047063982523824635,\n \"acc_norm\": 0.8575980880302728,\n \"acc_norm_stderr\": 0.0034874768122805247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977109,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977109\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380042,\n \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380042\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857744,\n \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857744\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.029773847012532967,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.029773847012532967\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275886,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289694,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.03275773486100999,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.03275773486100999\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471429,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471429\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631002,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631002\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813236,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813236\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8837803320561941,\n \"acc_stderr\": 0.011460632981922894,\n \"acc_norm\": 0.8837803320561941,\n \"acc_norm_stderr\": 0.011460632981922894\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.0165638293990477,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.0165638293990477\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02273378940544759,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02273378940544759\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5354609929078015,\n \"acc_stderr\": 0.029752389657427054,\n \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.029752389657427054\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n \"acc_stderr\": 0.012747248967079045,\n \"acc_norm\": 0.529986962190352,\n \"acc_norm_stderr\": 0.012747248967079045\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.01699272346546623,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.01699272346546623\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101713,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101713\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.017168830935187222,\n \"mc2\": 0.553071814559571,\n \"mc2_stderr\": 0.0151346546936277\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5928733889310084,\n \"acc_stderr\": 0.013532811069356528\n 
}\n}\n```", "repo_url": "https://huggingface.co/YeungNLP/firefly-mixtral-8x7b-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|arc:challenge|25_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|gsm8k|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hellaswag|10_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-17-56.230718.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-17-56.230718.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-17-56.230718.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T21-17-56.230718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-17-56.230718.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T21_17_56.230718", "path": ["**/details_harness|winogrande|5_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T21-17-56.230718.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T21_17_56.230718", "path": ["results_2023-12-23T21-17-56.230718.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T21-17-56.230718.parquet"]}]}]} | 2023-12-23T21:20:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1
Dataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T21:17:56.230718 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T21:17:56.230718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T21:17:56.230718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of YeungNLP/firefly-mixtral-8x7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-mixtral-8x7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T21:17:56.230718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
6e11ce5b4c82a9b00e9e8e96ba50f497bf1d9f75 |
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.3](https://huggingface.co/Mihaiii/Pallas-0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for a single task (here the 5-shot winogrande configuration);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.3",
	"harness_winogrande_5",
	split="train")
```
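
If you only need the aggregated metrics rather than the per-sample details, a minimal sketch is shown below; it relies on the `results` configuration and `latest` split described above, which are taken from this card's description rather than verified against the repository.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run. The "results" configuration and
# "latest" split follow the description above and are assumed to be present.
results = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.3",
	"results",
	split="latest")

print(results[0])  # one row of aggregated scores for the latest evaluation run
```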
## Latest results
These are the [latest results from run 2023-12-25T01:14:20.652633](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.3/blob/main/results_2023-12-25T01-14-20.652633.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7454130168383197,
"acc_stderr": 0.0290982917633922,
"acc_norm": 0.7502773723845314,
"acc_norm_stderr": 0.029647900326113162,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5731028245386227,
"mc2_stderr": 0.015807029979791075
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349814,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.01404910656495501
},
"harness|hellaswag|10": {
"acc": 0.6453893646683927,
"acc_stderr": 0.004774174590205144,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.0037221237096104584
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0289198029561349,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0289198029561349
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284332,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284332
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527041,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527041
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246794,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305424,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305424
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137092,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137092
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758545,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758545
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132366,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132366
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.02102926975242322,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.02102926975242322
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6614525139664804,
"acc_stderr": 0.015826700096481353,
"acc_norm": 0.6614525139664804,
"acc_norm_stderr": 0.015826700096481353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176693,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.023475581417861106,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.023475581417861106
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.029097675599463933,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.029097675599463933
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5801825293350718,
"acc_stderr": 0.012604960816087364,
"acc_norm": 0.5801825293350718,
"acc_norm_stderr": 0.012604960816087364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.016500472979024794,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.016500472979024794
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265016,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265016
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.031446603773522035,
"acc_norm": 0.89,
"acc_norm_stderr": 0.031446603773522035
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5731028245386227,
"mc2_stderr": 0.015807029979791075
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920522
},
"harness|gsm8k|5": {
"acc": 0.6027293404094011,
"acc_stderr": 0.013478659652337799
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Pallas-0.3 | [
"region:us"
] | 2023-12-23T21:27:55+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Pallas-0.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.3](https://huggingface.co/Mihaiii/Pallas-0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Pallas-0.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-25T01:14:20.652633](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.3/blob/main/results_2023-12-25T01-14-20.652633.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7454130168383197,\n \"acc_stderr\": 0.0290982917633922,\n \"acc_norm\": 0.7502773723845314,\n \"acc_norm_stderr\": 0.029647900326113162,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5731028245386227,\n \"mc2_stderr\": 0.015807029979791075\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349814,\n \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.01404910656495501\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6453893646683927,\n \"acc_stderr\": 0.004774174590205144,\n \"acc_norm\": 0.8330013941445927,\n \"acc_norm_stderr\": 0.0037221237096104584\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372277,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372277\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.0289198029561349,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.0289198029561349\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n 
\"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047736,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047736\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034516,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034516\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527041,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527041\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 
0.020567539567246794,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246794\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305424,\n \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305424\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.908256880733945,\n \"acc_stderr\": 0.012376323409137092,\n \"acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137092\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758545,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758545\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.017004368568132366,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.017004368568132366\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n 
\"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.02102926975242322,\n \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.02102926975242322\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6614525139664804,\n \"acc_stderr\": 0.015826700096481353,\n \"acc_norm\": 0.6614525139664804,\n \"acc_norm_stderr\": 0.015826700096481353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176693,\n \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.023475581417861106,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.023475581417861106\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6099290780141844,\n \"acc_stderr\": 0.029097675599463933,\n \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.029097675599463933\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5801825293350718,\n \"acc_stderr\": 0.012604960816087364,\n \"acc_norm\": 0.5801825293350718,\n \"acc_norm_stderr\": 0.012604960816087364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.016500472979024794,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.016500472979024794\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265016,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265016\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.031446603773522035,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.031446603773522035\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5731028245386227,\n \"mc2_stderr\": 0.015807029979791075\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920522\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6027293404094011,\n \"acc_stderr\": 0.013478659652337799\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Pallas-0.3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|arc:challenge|25_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|arc:challenge|25_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|gsm8k|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|gsm8k|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hellaswag|10_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hellaswag|10_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-41.795563.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-41.795563.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-41.795563.parquet"]}, 
{"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["**/details_harness|winogrande|5_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": ["**/details_harness|winogrande|5_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-25T01-14-20.652633.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T21_25_41.795563", "path": ["results_2023-12-23T21-25-41.795563.parquet"]}, {"split": "2023_12_25T01_14_20.652633", "path": 
["results_2023-12-25T01-14-20.652633.parquet"]}, {"split": "latest", "path": ["results_2023-12-25T01-14-20.652633.parquet"]}]}]} | 2023-12-25T01:16:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3
Dataset automatically created during the evaluation run of model Mihaiii/Pallas-0.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-25T01:14:20.652633 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-25T01:14:20.652633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-25T01:14:20.652633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-25T01:14:20.652633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
da4a8316c606800fd5468a1a057d26edd43d968b |
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.4](https://huggingface.co/Mihaiii/Pallas-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.4",
"harness_winogrande_5",
split="train")
```
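
The aggregated results described above can be loaded the same way. The snippet below is a minimal sketch, assuming the `results` configuration exposes a `latest` split like the per-task configurations do:

```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.4",
	"results",
	split="latest")
print(results)
```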
## Latest results
These are the [latest results from run 2023-12-24T22:40:47.293518](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.4/blob/main/results_2023-12-24T22-40-47.293518.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7456868749897599,
"acc_stderr": 0.0291121349888231,
"acc_norm": 0.7505483434369399,
"acc_norm_stderr": 0.029662136004319276,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.5729286090488297,
"mc2_stderr": 0.015803191112374947
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6451902011551484,
"acc_stderr": 0.004774778180345196,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.003722123709610458
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100824,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100824
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0289198029561349,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0289198029561349
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.04913595201274504,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.04913595201274504
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.02750175294441242,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.02750175294441242
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284332,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284332
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527041,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527041
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7897435897435897,
"acc_stderr": 0.02066059748502692,
"acc_norm": 0.7897435897435897,
"acc_norm_stderr": 0.02066059748502692
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8319327731092437,
"acc_stderr": 0.02428910211569229,
"acc_norm": 0.8319327731092437,
"acc_norm_stderr": 0.02428910211569229
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137092,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137092
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758545,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758545
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553855,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553855
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132366,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132366
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.02102926975242322,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.02102926975242322
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.659217877094972,
"acc_stderr": 0.015852002449862106,
"acc_norm": 0.659217877094972,
"acc_norm_stderr": 0.015852002449862106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176693,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.023475581417861106,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.023475581417861106
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.029097675599463933,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.029097675599463933
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5814863102998696,
"acc_stderr": 0.01259950560833648,
"acc_norm": 0.5814863102998696,
"acc_norm_stderr": 0.01259950560833648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.016453399332279326,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.016453399332279326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265016,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265016
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.031446603773522035,
"acc_norm": 0.89,
"acc_norm_stderr": 0.031446603773522035
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.5729286090488297,
"mc2_stderr": 0.015803191112374947
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.011116983392392659
},
"harness|gsm8k|5": {
"acc": 0.6027293404094011,
"acc_stderr": 0.013478659652337799
}
}
```
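
If you only need the aggregated figures shown above, you can also fetch the linked JSON file directly from the Hub. This is a minimal sketch; the filename comes from the link in the previous section, and the exact nesting of the JSON is an assumption:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for the latest run of this model.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Mihaiii__Pallas-0.4",
    filename="results_2023-12-24T22-40-47.293518.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The per-task scores shown above may sit at the top level or under a
# "results" key depending on the harness version; handle both.
scores = data.get("results", data)
print(scores["all"])
```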
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Pallas-0.4 | [
"region:us"
] | 2023-12-23T21:27:58+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Pallas-0.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.4](https://huggingface.co/Mihaiii/Pallas-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Pallas-0.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T22:40:47.293518](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.4/blob/main/results_2023-12-24T22-40-47.293518.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7456868749897599,\n \"acc_stderr\": 0.0291121349888231,\n \"acc_norm\": 0.7505483434369399,\n \"acc_norm_stderr\": 0.029662136004319276,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.5729286090488297,\n \"mc2_stderr\": 0.015803191112374947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6451902011551484,\n \"acc_stderr\": 0.004774778180345196,\n \"acc_norm\": 0.8330013941445927,\n \"acc_norm_stderr\": 0.003722123709610458\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100824,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100824\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.0289198029561349,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.0289198029561349\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n 
\"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.04913595201274504,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.04913595201274504\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.02750175294441242,\n \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.02750175294441242\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047736,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047736\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034516,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034516\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527041,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527041\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7897435897435897,\n \"acc_stderr\": 0.02066059748502692,\n 
\"acc_norm\": 0.7897435897435897,\n \"acc_norm_stderr\": 0.02066059748502692\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.02428910211569229,\n \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.02428910211569229\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.908256880733945,\n \"acc_stderr\": 0.012376323409137092,\n \"acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137092\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758545,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758545\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.017004368568132366,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.017004368568132366\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n \"acc_norm_stderr\": 
0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.02102926975242322,\n \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.02102926975242322\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.659217877094972,\n \"acc_stderr\": 0.015852002449862106,\n \"acc_norm\": 0.659217877094972,\n \"acc_norm_stderr\": 0.015852002449862106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176693,\n \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.023475581417861106,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.023475581417861106\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6099290780141844,\n \"acc_stderr\": 0.029097675599463933,\n \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.029097675599463933\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5814863102998696,\n \"acc_stderr\": 0.01259950560833648,\n \"acc_norm\": 0.5814863102998696,\n \"acc_norm_stderr\": 0.01259950560833648\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7908496732026143,\n \"acc_stderr\": 0.016453399332279326,\n \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.016453399332279326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265016,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265016\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.031446603773522035,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.031446603773522035\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.5729286090488297,\n \"mc2_stderr\": 0.015803191112374947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.011116983392392659\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6027293404094011,\n \"acc_stderr\": 0.013478659652337799\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Pallas-0.4", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|arc:challenge|25_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|arc:challenge|25_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|gsm8k|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|gsm8k|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hellaswag|10_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hellaswag|10_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-47.978596.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-47.978596.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T22-40-47.293518.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T22-40-47.293518.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T22-40-47.293518.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T22-40-47.293518.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-47.978596.parquet"]}, 
{"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["**/details_harness|winogrande|5_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": ["**/details_harness|winogrande|5_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T22-40-47.293518.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T21_25_47.978596", "path": ["results_2023-12-23T21-25-47.978596.parquet"]}, {"split": "2023_12_24T22_40_47.293518", "path": 
["results_2023-12-24T22-40-47.293518.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T22-40-47.293518.parquet"]}]}]} | 2023-12-24T22:43:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.4
Dataset automatically created during the evaluation run of model Mihaiii/Pallas-0.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
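For instance, a minimal sketch — the details repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming pattern, and the config and split names are taken from this card's configuration list:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this evaluation run.
# "latest" always points to the most recent evaluation; timestamped
# splits (e.g. "2023_12_24T22_40_47.293518") select a specific run.
data = load_dataset(
    "open-llm-leaderboard/details_Mihaiii__Pallas-0.4",  # assumed repo id
    "harness_winogrande_5",
    split="latest",
)
```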
## Latest results
These are the latest results from run 2023-12-24T22:40:47.293518 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T22:40:47.293518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T22:40:47.293518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Pallas-0.4\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T22:40:47.293518(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
099af9a67288e7a90fd8442e1bab6cc74ccda63e |
# peS2o: 100k 'xlong' sample
A sample of 100k docs from `allenai/peS2o`:
- all docs filtered to be more than 4096 and less than 34,000 GPT-4 tiktoken tokens | BEE-spoke-data/peS2o-100k_en-xlong | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"source_datasets:allenai/peS2o",
"license:odc-by",
"region:us"
] | 2023-12-23T22:10:15+00:00 | {"license": "odc-by", "size_categories": ["10K<n<100K"], "source_datasets": "allenai/peS2o", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3512277106, "num_examples": 100000}], "download_size": 1748605733, "dataset_size": 3512277106}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-24T04:14:02+00:00 | [] | [] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-allenai/peS2o #license-odc-by #region-us
|
# peS2o: 100k 'xlong' sample
A sample of 100k docs from 'allenai/peS2o':
- all docs filtered to be more than 4096 and less than 34,000 GPT-4 tiktoken tokens | [
"# peS2o: 100k 'xlong' sample\n\n\nA sample of 100k docs from 'allenai/peS2o':\n\n- all docs filtered to be more than 4096 and less than 34,000 GPT-4 tiktoken tokens"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-allenai/peS2o #license-odc-by #region-us \n",
"# peS2o: 100k 'xlong' sample\n\n\nA sample of 100k docs from 'allenai/peS2o':\n\n- all docs filtered to be more than 4096 and less than 34,000 GPT-4 tiktoken tokens"
] | [
51,
56
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #source_datasets-allenai/peS2o #license-odc-by #region-us \n# peS2o: 100k 'xlong' sample\n\n\nA sample of 100k docs from 'allenai/peS2o':\n\n- all docs filtered to be more than 4096 and less than 34,000 GPT-4 tiktoken tokens"
] |
fe12fc5f1b7606543b0355eda392f1ddc54625c6 | ## Description
The RESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). This dataset contains 31,500 images, covering 45 scene classes with 700 images in each class.
The original dataset does not define any default splits. The train, validation, and test splits in this repository follow the definitions at https://github.com/google-research/google-research/blob/master/remote_sensing_representations/README.md#dataset-splits
- Paper: https://arxiv.org/abs/1703.00121
- Website: https://paperswithcode.com/dataset/resisc45 (the original homepage, http://www.escience.cn/people/JunweiHan/NWPU-RESISC45.html, is unresponsive)
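
## Usage

A minimal loading sketch with the Hugging Face `datasets` library (assuming the default configuration and the train/validation/test splits listed in this repository):

```python
from datasets import load_dataset

# the default config exposes train / validation / test splits
ds = load_dataset("timm/resisc45")

sample = ds["train"][0]  # {'image': PIL image, 'label': class index, 'image_id': string}
label_name = ds["train"].features["label"].int2str(sample["label"])
print(label_name, sample["image_id"])
```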
## Citation
```bibtex
@article{Cheng_2017,
title={Remote Sensing Image Scene Classification: Benchmark and State of the Art},
volume={105},
ISSN={1558-2256},
url={http://dx.doi.org/10.1109/JPROC.2017.2675998},
DOI={10.1109/jproc.2017.2675998},
number={10},
journal={Proceedings of the IEEE},
publisher={Institute of Electrical and Electronics Engineers (IEEE)},
author={Cheng, Gong and Han, Junwei and Lu, Xiaoqiang},
year={2017},
month={Oct},
pages={1865-1883}
}
``` | timm/resisc45 | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"license:unknown",
"arxiv:1703.00121",
"region:us"
] | 2023-12-23T22:40:50+00:00 | {"license": "unknown", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "airport", "2": "baseball_diamond", "3": "basketball_court", "4": "beach", "5": "bridge", "6": "chaparral", "7": "church", "8": "circular_farmland", "9": "cloud", "10": "commercial_area", "11": "dense_residential", "12": "desert", "13": "forest", "14": "freeway", "15": "golf_course", "16": "ground_track_field", "17": "harbor", "18": "industrial_area", "19": "intersection", "20": "island", "21": "lake", "22": "meadow", "23": "medium_residential", "24": "mobile_home_park", "25": "mountain", "26": "overpass", "27": "palace", "28": "parking_lot", "29": "railway", "30": "railway_station", "31": "rectangular_farmland", "32": "river", "33": "roundabout", "34": "runway", "35": "sea_ice", "36": "ship", "37": "snowberg", "38": "sparse_residential", "39": "stadium", "40": "storage_tank", "41": "tennis_court", "42": "terrace", "43": "thermal_power_station", "44": "wetland"}}}}, {"name": "image_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 254594749.8, "num_examples": 18900}, {"name": "validation", "num_bytes": 84784207.3, "num_examples": 6300}, {"name": "test", "num_bytes": 85237234, "num_examples": 6300}], "download_size": 425667137, "dataset_size": 424616191.1}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-07T18:11:08+00:00 | [
"1703.00121"
] | [] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #license-unknown #arxiv-1703.00121 #region-us
| ## Description
RESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). This dataset contains 31,500 images, covering 45 scene classes with 700 images in each class.
The dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL
- Paper: URL
- Website: URL (original homepage is unresponsive URL
| [
"## Description\nRESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). This dataset contains 31,500 images, covering 45 scene classes with 700 images in each class.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n- Paper: URL\n- Website: URL (original homepage is unresponsive URL"
] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-unknown #arxiv-1703.00121 #region-us \n",
"## Description\nRESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). This dataset contains 31,500 images, covering 45 scene classes with 700 images in each class.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n- Paper: URL\n- Website: URL (original homepage is unresponsive URL"
] | [
44,
106
] | [
"passage: TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-unknown #arxiv-1703.00121 #region-us \n## Description\nRESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). This dataset contains 31,500 images, covering 45 scene classes with 700 images in each class.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n- Paper: URL\n- Website: URL (original homepage is unresponsive URL"
] |
51188e0c02ac77c8d7e849c8a4e4e6005955e938 |
# Dataset Card for Evaluation run of orangetin/OpenHermes-Mixtral-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [orangetin/OpenHermes-Mixtral-8x7B](https://huggingface.co/orangetin/OpenHermes-Mixtral-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B",
"harness_winogrande_5",
split="train")
```
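
To see which task configurations are available before loading one, a small sketch using the `datasets` library (the exact names printed depend on the current repository contents):

```python
from datasets import get_dataset_config_names

repo = "open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B"
configs = get_dataset_config_names(repo)
print(len(configs))   # one entry per evaluated task configuration
print(configs[:3])    # e.g. 'harness_arc_challenge_25', 'harness_gsm8k_5', ...
```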
## Latest results
These are the [latest results from run 2023-12-23T22:42:57.677534](https://huggingface.co/datasets/open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B/blob/main/results_2023-12-23T22-42-57.677534.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6410820826815717,
"acc_stderr": 0.03191534800972364,
"acc_norm": 0.6461749295540492,
"acc_norm_stderr": 0.03253322826774671,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.5952583328395225,
"mc2_stderr": 0.015815707878895833
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467327,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.01403476138617545
},
"harness|hellaswag|10": {
"acc": 0.658832901812388,
"acc_stderr": 0.004731324409133277,
"acc_norm": 0.8413662617008564,
"acc_norm_stderr": 0.0036458755686012835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.034767257476490364,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.034767257476490364
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429903,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429903
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612903,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.02362715946031868,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.02362715946031868
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147292,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147292
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678502,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678502
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.018433427649401892,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.018433427649401892
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.5952583328395225,
"mc2_stderr": 0.015815707878895833
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
},
"harness|gsm8k|5": {
"acc": 0.45716451857467777,
"acc_stderr": 0.013721849968709721
}
}
```
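
As a worked example of using these numbers, here is a small sketch that averages the per-subject MMLU scores from a parsed copy of the JSON above (`results_json` is assumed to hold that JSON document as a string; it is not part of this repository):

```python
import json

# `results_json` is assumed to contain the "all"/"harness|..." dictionary shown above
results = json.loads(results_json)

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU acc_norm averaged over {len(mmlu)} subjects: {mmlu_avg:.4f}")
```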
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B | [
"region:us"
] | 2023-12-23T22:45:15+00:00 | {"pretty_name": "Evaluation run of orangetin/OpenHermes-Mixtral-8x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [orangetin/OpenHermes-Mixtral-8x7B](https://huggingface.co/orangetin/OpenHermes-Mixtral-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T22:42:57.677534](https://huggingface.co/datasets/open-llm-leaderboard/details_orangetin__OpenHermes-Mixtral-8x7B/blob/main/results_2023-12-23T22-42-57.677534.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6410820826815717,\n \"acc_stderr\": 0.03191534800972364,\n \"acc_norm\": 0.6461749295540492,\n \"acc_norm_stderr\": 0.03253322826774671,\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.5952583328395225,\n \"mc2_stderr\": 0.015815707878895833\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467327,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.01403476138617545\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.658832901812388,\n \"acc_stderr\": 0.004731324409133277,\n \"acc_norm\": 0.8413662617008564,\n \"acc_norm_stderr\": 0.0036458755686012835\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612903,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612903\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8438818565400844,\n \"acc_stderr\": 0.02362715946031868,\n \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.02362715946031868\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n \"acc_stderr\": 0.012658201736147292,\n 
\"acc_norm\": 0.8531289910600255,\n \"acc_norm_stderr\": 0.012658201736147292\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258165,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258165\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.016232826818678502,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.016232826818678502\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.018433427649401892,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.018433427649401892\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.5952583328395225,\n \"mc2_stderr\": 0.015815707878895833\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45716451857467777,\n \"acc_stderr\": 0.013721849968709721\n }\n}\n```", "repo_url": 
"https://huggingface.co/orangetin/OpenHermes-Mixtral-8x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|arc:challenge|25_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|gsm8k|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hellaswag|10_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T22-42-57.677534.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T22-42-57.677534.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T22-42-57.677534.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T22-42-57.677534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T22-42-57.677534.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T22-42-57.677534.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["**/details_harness|winogrande|5_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T22-42-57.677534.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_23T22_42_57.677534", "path": ["results_2023-12-23T22-42-57.677534.parquet"]}, {"split": "latest", "path": 
["results_2023-12-23T22-42-57.677534.parquet"]}]}]} | 2023-12-23T22:45:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of orangetin/OpenHermes-Mixtral-8x7B
Dataset automatically created during the evaluation run of model orangetin/OpenHermes-Mixtral-8x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-23T22:42:57.677534 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of orangetin/OpenHermes-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model orangetin/OpenHermes-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T22:42:57.677534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of orangetin/OpenHermes-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model orangetin/OpenHermes-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T22:42:57.677534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of orangetin/OpenHermes-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model orangetin/OpenHermes-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T22:42:57.677534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
b4e28552cd5f3932b6abc37eb20d3e84901ad728 |
# EuroSat (RGB)
## Description
A dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples. This is the RGB version of the dataset with visible bands encoded as JPEG images.
The dataset does not have any default splits. The train, validation, and test splits provided here are based on the definitions at https://github.com/google-research/google-research/blob/master/remote_sensing_representations/README.md#dataset-splits
* Website: https://github.com/phelber/eurosat
* Paper: https://arxiv.org/abs/1709.00029
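
A minimal loading sketch with the Hugging Face `datasets` library is shown below; it assumes the repository id `timm/eurosat-rgb` and the train/validation/test splits and features (`image`, `label`, `image_id`) declared in this repository's metadata.

```python
from datasets import load_dataset

# Load the RGB EuroSat splits from this repository.
ds = load_dataset("timm/eurosat-rgb")
train, val, test = ds["train"], ds["validation"], ds["test"]
print(len(train), len(val), len(test))  # expected: 16200 5400 5400

# Each example holds a JPEG image, an integer label over 10 land-cover classes, and an image_id string.
example = train[0]
print(example["label"], example["image_id"], example["image"].size)
```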
## Citation
```bibtext
@article{helber2019eurosat,
title={Eurosat: A novel dataset and deep learning benchmark for land use and land cover classification},
author={Helber, Patrick and Bischke, Benjamin and Dengel, Andreas and Borth, Damian},
journal={IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing},
year={2019},
publisher={IEEE}
}
``` | timm/eurosat-rgb | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"license:mit",
"arxiv:1709.00029",
"region:us"
] | 2023-12-23T22:52:35+00:00 | {"license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "AnnualCrop", "1": "Forest", "2": "HerbaceousVegetation", "3": "Highway", "4": "Industrial", "5": "Pasture", "6": "PermanentCrop", "7": "Residential", "8": "River", "9": "SeaLake"}}}}, {"name": "image_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 55332279, "num_examples": 16200}, {"name": "validation", "num_bytes": 18472972.2, "num_examples": 5400}, {"name": "test", "num_bytes": 18625106.4, "num_examples": 5400}], "download_size": 92078756, "dataset_size": 92430357.6}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-07T18:11:26+00:00 | [
"1709.00029"
] | [] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #license-mit #arxiv-1709.00029 #region-us
|
# EuroSat (RGB)
## Description
A dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples. This is the RGB version of the dataset with visible bands encoded as JPEG images.
The dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL
* Website: URL
* Paper: URL
| [
"# EuroSat (RGB)",
"## Description\n\nA dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples. This is the RGB version of the dataset with visible bands encoded as JPEG images.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n* Website: URL\n* Paper: URL"
] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-mit #arxiv-1709.00029 #region-us \n",
"# EuroSat (RGB)",
"## Description\n\nA dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples. This is the RGB version of the dataset with visible bands encoded as JPEG images.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n* Website: URL\n* Paper: URL"
] | [
42,
8,
95
] | [
"passage: TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-mit #arxiv-1709.00029 #region-us \n# EuroSat (RGB)## Description\n\nA dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples. This is the RGB version of the dataset with visible bands encoded as JPEG images.\n\nThe dataset does not have any default splits. Train, validation, and test splits were based on these definitions here URL\n\n* Website: URL\n* Paper: URL"
] |
4c9cd41ab1682fa28739320ba33444f7fff85d8f |
# Pseudolabel Malaysian Youtube videos using Whisper Large V3
Original dataset at https://huggingface.co/datasets/malaysia-ai/crawl-youtube, pseudolabelled in a distributed fashion using 4x A100s.
Script at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text-semisupervised/pseudolabel-whisper
1. Each audio clip is 30 seconds long.
2. Each audio clip is saved at a 16 kHz sample rate. | mesolitica/pseudolabel-malaysian-youtube-whisper-large-v3 | [
"task_categories:automatic-speech-recognition",
"language:ms",
"region:us"
] | 2023-12-23T23:07:43+00:00 | {"language": ["ms"], "task_categories": ["automatic-speech-recognition"]} | 2024-01-01T04:17:26+00:00 | [] | [
"ms"
] | TAGS
#task_categories-automatic-speech-recognition #language-Malay (macrolanguage) #region-us
|
# Pseudolabel Malaysian Youtube videos using Whisper Large V3
Original dataset at URL distributed pseudolabelled using 4x A100s
script at URL
1. Each audio is 30 seconds.
2. Each audio saved in 16k sample rate. | [
"# Pseudolabel Malaysian Youtube videos using Whisper Large V3\n\nOriginal dataset at URL distributed pseudolabelled using 4x A100s\n\nscript at URL\n\n1. Each audio is 30 seconds.\n2. Each audio saved in 16k sample rate."
] | [
"TAGS\n#task_categories-automatic-speech-recognition #language-Malay (macrolanguage) #region-us \n",
"# Pseudolabel Malaysian Youtube videos using Whisper Large V3\n\nOriginal dataset at URL distributed pseudolabelled using 4x A100s\n\nscript at URL\n\n1. Each audio is 30 seconds.\n2. Each audio saved in 16k sample rate."
] | [
32,
53
] | [
"passage: TAGS\n#task_categories-automatic-speech-recognition #language-Malay (macrolanguage) #region-us \n# Pseudolabel Malaysian Youtube videos using Whisper Large V3\n\nOriginal dataset at URL distributed pseudolabelled using 4x A100s\n\nscript at URL\n\n1. Each audio is 30 seconds.\n2. Each audio saved in 16k sample rate."
] |
66a03db98e864b26b438bc620cbd3428c66b081e |
# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [VAGOsolutions/SauerkrautLM-Mixtral-8x7B](https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-23T23:09:05.690555](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B/blob/main/results_2023-12-23T23-09-05.690555.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6660391753947925,
"acc_stderr": 0.03158802076488222,
"acc_norm": 0.6705871411561132,
"acc_norm_stderr": 0.03221037708587965,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5719887946496497,
"mc2_stderr": 0.01564526507191298
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145678,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.013532472099850935
},
"harness|hellaswag|10": {
"acc": 0.6775542720573591,
"acc_stderr": 0.004664572784985589,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.0034618713240671967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.02688064788905199,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.02688064788905199
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4576719576719577,
"acc_stderr": 0.025658868862058322,
"acc_norm": 0.4576719576719577,
"acc_norm_stderr": 0.025658868862058322
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.03995524007681681,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.03995524007681681
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033055,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318674,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.033907806129727755,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.033907806129727755
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.855683269476373,
"acc_stderr": 0.012566417503320946,
"acc_norm": 0.855683269476373,
"acc_norm_stderr": 0.012566417503320946
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.016329061073207442,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.016329061073207442
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445803,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.01276345073469982,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.01276345073469982
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.018342529845275908,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.018342529845275908
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5719887946496497,
"mc2_stderr": 0.01564526507191298
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.01113409941593827
},
"harness|gsm8k|5": {
"acc": 0.47536012130401817,
"acc_stderr": 0.013755751352764916
}
}
```
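
As a small complementary sketch, the per-task details behind these aggregates can be pulled one configuration at a time; the config name `harness_gsm8k_5` below is taken from this repository's configuration list, and `split="latest"` selects the most recent run (here 2023-12-23T23:09:05).

```python
from datasets import load_dataset

# Load only the GSM8K per-example details for the most recent run of this evaluation.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```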
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B | [
"region:us"
] | 2023-12-23T23:11:23+00:00 | {"pretty_name": "Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [VAGOsolutions/SauerkrautLM-Mixtral-8x7B](https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-23T23:09:05.690555](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B/blob/main/results_2023-12-23T23-09-05.690555.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6660391753947925,\n \"acc_stderr\": 0.03158802076488222,\n \"acc_norm\": 0.6705871411561132,\n \"acc_norm_stderr\": 0.03221037708587965,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5719887946496497,\n \"mc2_stderr\": 0.01564526507191298\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145678,\n \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.013532472099850935\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6775542720573591,\n \"acc_stderr\": 0.004664572784985589,\n \"acc_norm\": 0.8600876319458275,\n \"acc_norm_stderr\": 0.0034618713240671967\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.02688064788905199,\n \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.02688064788905199\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.046774730044912,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.046774730044912\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058322,\n \"acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058322\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568532,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n 
\"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.03995524007681681,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.03995524007681681\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033055,\n \"acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033055\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318674,\n \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318674\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.033907806129727755,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.033907806129727755\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.855683269476373,\n \"acc_stderr\": 0.012566417503320946,\n \"acc_norm\": 0.855683269476373,\n \"acc_norm_stderr\": 0.012566417503320946\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n \"acc_stderr\": 0.016329061073207442,\n \"acc_norm\": 0.39217877094972065,\n \"acc_norm_stderr\": 0.016329061073207442\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445803,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n \"acc_stderr\": 0.01276345073469982,\n \"acc_norm\": 0.48370273794002605,\n \"acc_norm_stderr\": 0.01276345073469982\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.018342529845275908,\n \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.018342529845275908\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5719887946496497,\n \"mc2_stderr\": 0.01564526507191298\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.01113409941593827\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47536012130401817,\n 
\"acc_stderr\": 0.013755751352764916\n }\n}\n```", "repo_url": "https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|arc:challenge|25_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|gsm8k|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hellaswag|10_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T23-09-05.690555.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T23-09-05.690555.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-23T23-09-05.690555.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-23T23-09-05.690555.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T23-09-05.690555.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_23T23_09_05.690555", "path": ["**/details_harness|winogrande|5_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-23T23-09-05.690555.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_23T23_09_05.690555", "path": ["results_2023-12-23T23-09-05.690555.parquet"]}, {"split": "latest", "path": ["results_2023-12-23T23-09-05.690555.parquet"]}]}]} | 2023-12-23T23:11:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B
Dataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
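A minimal sketch; the repository name is assumed from the details_<org>__<model> convention used by the other Open LLM Leaderboard detail datasets, and "harness_winogrande_5" is one of the configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Repository name assumed from the standard open-llm-leaderboard naming pattern;
# the configuration name comes from this card's list of configs.
data = load_dataset("open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B",
	"harness_winogrande_5",
	split="train")
```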
## Latest results
These are the latest results from run 2023-12-23T23:09:05.690555 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T23:09:05.690555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-23T23:09:05.690555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-23T23:09:05.690555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
612bc5039a457880d9e7d84c3b0a4cf154b70e4f |
# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps
Official mirror of <https://github.com/Alab-NII/2wikimultihop> | xanhho/2WikiMultihopQA | [
"task_categories:question-answering",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-23T23:37:26+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["question-answering"]} | 2024-01-20T12:39:38+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us
|
# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps
Official mirror of <URL | [
"# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps\n\nOfficial mirror of <URL"
] | [
"TAGS\n#task_categories-question-answering #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us \n",
"# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps\n\nOfficial mirror of <URL"
] | [
42,
27
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us \n# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps\n\nOfficial mirror of <URL"
] |
ef4a78b648b94c90d99d7d383a99372dab17456c |
# Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/MixtralRPChat-ZLoss](https://huggingface.co/chargoddard/MixtralRPChat-ZLoss) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss",
"harness_winogrande_5",
split="train")
```
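The aggregated scores can be read the same way; a minimal sketch, assuming the "results" configuration and "latest" split follow the standard layout listed in this card's metadata:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics and "latest" points at the most recent run
# (both names assumed from the standard open-llm-leaderboard detail-dataset layout).
results = load_dataset("open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss",
	"results",
	split="latest")
```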
## Latest results
These are the [latest results from run 2023-12-24T00:10:11.003805](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss/blob/main/results_2023-12-24T00-10-11.003805.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7014965083144905,
"acc_stderr": 0.030525173302251206,
"acc_norm": 0.7067661366946931,
"acc_norm_stderr": 0.031115835600048672,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5385273808900092,
"mc2_stderr": 0.015024918935321629
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382501,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726291
},
"harness|hellaswag|10": {
"acc": 0.6623182632941645,
"acc_stderr": 0.004719529099913136,
"acc_norm": 0.8609838677554272,
"acc_norm_stderr": 0.0034525630964691227
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388535,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388535
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.025435119438105364,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.025435119438105364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445784,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445784
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216052,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.02301544687798568,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.02301544687798568
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225174,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225174
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5130378096479792,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.5130378096479792,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.025767252010855952,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.025767252010855952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900808,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900808
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5385273808900092,
"mc2_stderr": 0.015024918935321629
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068682
},
"harness|gsm8k|5": {
"acc": 0.5056861258529188,
"acc_stderr": 0.013771594106283033
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss | [
"region:us"
] | 2023-12-24T00:12:27+00:00 | {"pretty_name": "Evaluation run of chargoddard/MixtralRPChat-ZLoss", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/MixtralRPChat-ZLoss](https://huggingface.co/chargoddard/MixtralRPChat-ZLoss) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T00:10:11.003805](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss/blob/main/results_2023-12-24T00-10-11.003805.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7014965083144905,\n \"acc_stderr\": 0.030525173302251206,\n \"acc_norm\": 0.7067661366946931,\n \"acc_norm_stderr\": 0.031115835600048672,\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5385273808900092,\n \"mc2_stderr\": 0.015024918935321629\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n \"acc_stderr\": 0.004719529099913136,\n \"acc_norm\": 0.8609838677554272,\n \"acc_norm_stderr\": 0.0034525630964691227\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911274,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388535,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388535\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105364,\n \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8697318007662835,\n \"acc_stderr\": 0.012036729568216052,\n \"acc_norm\": 0.8697318007662835,\n \"acc_norm_stderr\": 0.012036729568216052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071134,\n \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071134\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.02301544687798568,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.02301544687798568\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225174,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225174\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.549645390070922,\n \"acc_stderr\": 0.02968010556502904,\n \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.02968010556502904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5130378096479792,\n \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.5130378096479792,\n \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.025767252010855952,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.025767252010855952\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.024112678240900808,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.024112678240900808\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5385273808900092,\n \"mc2_stderr\": 0.015024918935321629\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5056861258529188,\n \"acc_stderr\": 0.013771594106283033\n }\n}\n```", 
"repo_url": "https://huggingface.co/chargoddard/MixtralRPChat-ZLoss", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|arc:challenge|25_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|gsm8k|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hellaswag|10_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T00_10_11.003805", "path": ["**/details_harness|winogrande|5_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T00-10-11.003805.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_24T00_10_11.003805", "path": ["results_2023-12-24T00-10-11.003805.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T00-10-11.003805.parquet"]}]}]} | 2023-12-24T00:12:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss
Dataset automatically created during the evaluation run of model chargoddard/MixtralRPChat-ZLoss on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
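For instance, a minimal snippet (mirroring the example kept in this card's metadata, with the `harness_winogrande_5` configuration as the example task; any other listed configuration works the same way):
```python
from datasets import load_dataset

# Details for one evaluated task of this run; per the card, the "train"
# split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss",
    "harness_winogrande_5",
    split="train",
)
```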
## Latest results
These are the latest results from run 2023-12-24T00:10:11.003805 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
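The per-task JSON for that run lives in the dataset itself; as a sketch, using the `results` configuration and `latest` split declared in this card's configs, the aggregated metrics can be pulled with:
```python
from datasets import load_dataset

# The "results" configuration aggregates the run's metrics; its "latest"
# split points at results_2023-12-24T00-10-11.003805.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss",
    "results",
    split="latest",
)
print(results[0])
```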
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/MixtralRPChat-ZLoss on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T00:10:11.003805(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/MixtralRPChat-ZLoss on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T00:10:11.003805(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/MixtralRPChat-ZLoss on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T00:10:11.003805(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
10e05c95399cc932005344a9a45250bfbd6ea3a4 |
# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct",
"harness_winogrande_5",
split="train")
```
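The same pattern extends to the other listed configurations; as a sketch (assuming the `results` configuration and `latest` split that this card describes for aggregated metrics), the summary numbers can be pulled directly:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; individual runs are also
# exposed as splits named after the run timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct",
    "results",
    split="latest",
)
print(results[0])
```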
## Latest results
These are the [latest results from run 2023-12-29T17:52:13.731168](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct/blob/main/results_2023-12-29T17-52-13.731168.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7125714200175716,
"acc_stderr": 0.030241466212459073,
"acc_norm": 0.7162852145239326,
"acc_norm_stderr": 0.03082439912507631,
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6571120138409167,
"mc2_stderr": 0.015075157168756756
},
"harness|arc:challenge|25": {
"acc": 0.6783276450511946,
"acc_stderr": 0.013650488084494162,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.6898028281218881,
"acc_stderr": 0.004616288245259752,
"acc_norm": 0.8775144393547102,
"acc_norm_stderr": 0.0032717574453291543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.02575755989310673,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.02575755989310673
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.02003956362805329,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.02003956362805329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687964,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687964
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.025435119438105364,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.025435119438105364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281731,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281731
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377347,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971716,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971716
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.876117496807152,
"acc_stderr": 0.011781017100950737,
"acc_norm": 0.876117496807152,
"acc_norm_stderr": 0.011781017100950737
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112032,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112032
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514266,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059686,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778027,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778027
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5404172099087353,
"acc_stderr": 0.012728446067669943,
"acc_norm": 0.5404172099087353,
"acc_norm_stderr": 0.012728446067669943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767707,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803404,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366145,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6571120138409167,
"mc2_stderr": 0.015075157168756756
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6080363912054587,
"acc_stderr": 0.013447140886023817
}
}
```
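If you prefer to work with the raw aggregated-results file rather than the table above, the snippet below is a minimal sketch of fetching it with `huggingface_hub`. The filename is the timestamped results JSON referenced by this card; the exact key layout inside the file is an assumption, so the access is written defensively.

```python
import json

from huggingface_hub import hf_hub_download

# Download the timestamped aggregated-results file referenced by this card.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct",
    filename="results_2023-12-29T17-52-13.731168.json",
    repo_type="dataset",
)

with open(results_path) as f:
    data = json.load(f)

# The aggregated metrics shown above are expected under a top-level "results"
# key (an assumption about the file layout; fall back to the root otherwise).
print(data.get("results", data).get("all"))
```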
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
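One likely direct use (a sketch, not an official recommendation) is inspecting the per-sample predictions behind a single benchmark score reported above. The config and split names below are taken from this card's file listing; any other `harness_*` configuration follows the same pattern.

```python
from datasets import load_dataset

# Per-task details live in configs named after the harness task; the "latest"
# split always points at the most recent evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct",
    "harness_gsm8k_5",
    split="latest",
)

print(gsm8k_details)     # column names and number of examples
print(gsm8k_details[0])  # one evaluated sample with the model's output
```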
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
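The structure that can be read off this card's file listing: one configuration per evaluated task, and within each configuration one split per run timestamp plus a `latest` alias. A small sketch for enumerating them with the `datasets` library (names are discovered at runtime rather than hard-coded):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct"

# One config per evaluated task, e.g. "harness_arc_challenge_25", "harness_gsm8k_5", ...
configs = get_dataset_config_names(repo)
print(len(configs), "configurations")

# Each config exposes one split per run timestamp plus a "latest" alias.
for config in configs[:3]:
    print(config, "->", get_dataset_split_names(repo, config))
```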
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct | [
"region:us"
] | 2023-12-24T00:30:12+00:00 | {"pretty_name": "Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T17:52:13.731168](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct/blob/main/results_2023-12-29T17-52-13.731168.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7125714200175716,\n \"acc_stderr\": 0.030241466212459073,\n \"acc_norm\": 0.7162852145239326,\n \"acc_norm_stderr\": 0.03082439912507631,\n \"mc1\": 0.5067319461444308,\n \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6571120138409167,\n \"mc2_stderr\": 0.015075157168756756\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6783276450511946,\n \"acc_stderr\": 0.013650488084494162,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6898028281218881,\n \"acc_stderr\": 0.004616288245259752,\n \"acc_norm\": 0.8775144393547102,\n \"acc_norm_stderr\": 0.0032717574453291543\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.02575755989310673,\n \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.02575755989310673\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 
0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8548387096774194,\n \"acc_stderr\": 0.02003956362805329,\n \"acc_norm\": 0.8548387096774194,\n \"acc_norm_stderr\": 0.02003956362805329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n \"acc_norm\": 
0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687964,\n \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687964\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630882,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630882\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105364,\n \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281731,\n \"acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281731\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377347,\n \"acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377347\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971716,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971716\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n \"acc_stderr\": 0.011781017100950737,\n \"acc_norm\": 0.876117496807152,\n \"acc_norm_stderr\": 0.011781017100950737\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n \"acc_stderr\": 0.016669799592112032,\n \"acc_norm\": 0.46033519553072627,\n \"acc_norm_stderr\": 0.016669799592112032\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514266,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514266\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778027,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778027\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5638297872340425,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5404172099087353,\n \"acc_stderr\": 0.012728446067669943,\n \"acc_norm\": 0.5404172099087353,\n \"acc_norm_stderr\": 0.012728446067669943\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767707,\n \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767707\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803404,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366145,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5067319461444308,\n \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6571120138409167,\n \"mc2_stderr\": 0.015075157168756756\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.6080363912054587,\n \"acc_stderr\": 0.013447140886023817\n }\n}\n```", "repo_url": "https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|arc:challenge|25_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|arc:challenge|25_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|gsm8k|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|gsm8k|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hellaswag|10_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hellaswag|10_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-27-55.895777.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T00-27-55.895777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-52-13.731168.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-52-13.731168.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-52-13.731168.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T17-52-13.731168.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-52-13.731168.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": 
"2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-27-55.895777.parquet"]}, 
{"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["**/details_harness|winogrande|5_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": ["**/details_harness|winogrande|5_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T17-52-13.731168.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T00_27_55.895777", "path": ["results_2023-12-24T00-27-55.895777.parquet"]}, {"split": "2023_12_29T17_52_13.731168", "path": 
["results_2023-12-29T17-52-13.731168.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T17-52-13.731168.parquet"]}]}]} | 2023-12-29T17:54:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
Dataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
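A minimal sketch, assuming the repository follows the leaderboard's standard `details_<org>__<model>` naming (any of the 63 task configurations, such as `harness_winogrande_5`, can be requested):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",
)
```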
## Latest results
These are the latest results from run 2023-12-29T17:52:13.731168 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T17:52:13.731168(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T17:52:13.731168(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct\n\n\n\nDataset automatically created during the evaluation run of model VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T17:52:13.731168(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
aa6b10ab7f2da698a01870de862aeb9a127bbfee | # Dataset Card for "pokemon-with-pokedex-descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jev217/pokemon-with-pokedex-descriptions | [
"region:us"
] | 2023-12-24T00:33:04+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 992086.515, "num_examples": 1017}], "download_size": 1007885, "dataset_size": 992086.515}} | 2023-12-24T00:34:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pokemon-with-pokedex-descriptions"
More Information needed | [
"# Dataset Card for \"pokemon-with-pokedex-descriptions\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pokemon-with-pokedex-descriptions\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pokemon-with-pokedex-descriptions\"\n\nMore Information needed"
] |
53cf7b4b1e8781cb7f340cc914ff54b3a5eed6b4 | # Everything Has Context | contrived company research example (ehc-contrived-financial)
### 📝 Description
The `company_revenue_train.csv` file (citation: `train.csv` of dylanalloy/ehc-contrived-financial) contains 12,515 rows of high-quality contrived<sup>1</sup> research patterns in the public market equities category for Q/A pairs with a high perplexity<sup>2</sup>.
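A quick way to load and inspect the file with pandas (a minimal sketch; the column layout is not spelled out here, so check it after loading):

```python
import pandas as pd

# Load the training split described above.
df = pd.read_csv("company_revenue_train.csv")

print(len(df))       # expected: 12,515 rows
print(df.columns)    # inspect the actual Q/A + context column names
print(df.head())
```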
The data is generated from `davinci-turbo` using the OpenAI API with prompts engineered to do several things which incite a grounded hallucinatory research example each call:
1. Generate one-shot Q/A example with a mask for the subject using the syntax `[Company]` which has a high perplexity and thus requires multiple follow-up questions (or the answer itself requires two sources of external context).
2. Between the question and answer of each one-shot example, hallucinate context from a search of equity filings data required to get to the answer.
3. Replace `[Company]` instances with a random company from a list in our case of 118 companies<sup>*</sup>
4. Filter on all rows for conditions which suit your needs (we choose higher perplexity, which we define in a contrived dataset as: `∀(context,followup)∈S, where S is the dataset, and ∣{(context,followup)}∣>2`); a filtering sketch is shown below
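A minimal pandas sketch of the perplexity filter in step 4. The `context` column name and the `[CONTEXT]` marker used to count retrieval hops are assumptions for illustration only; adjust them to the actual schema of the file:

```python
import pandas as pd

df = pd.read_csv("company_revenue_train.csv")  # or reuse the dataframe loaded above

# Hypothetical: assume each retrieved snippet in a row is tagged with "[CONTEXT]",
# so counting the tag approximates the number of (context, followup) pairs per example.
def n_context_pairs(text: str) -> int:
    return str(text).count("[CONTEXT]")

# Keep only the high-perplexity rows: more than 2 (context, followup) pairs.
filtered = df[df["context"].map(n_context_pairs) > 2]
print(f"kept {len(filtered)} of {len(df)} rows")
```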
### 🙈 Contrived!
It's not real context. We are researching what this means for compositionality gaps in the respective domain for the model finetuning. There are perhaps more obvious limitations around the ability to reason on questions with high perplexity involved which the model has not been finetuned on, especially as reasoning about the question's context requirements could grow. Naively-posed questions, loaded questions, or questions of a contradictory manner may throw off the reasoning and context retrieval abilities of a finetuned model derived from a contrived 'environment', if you will. These are just some of the challenges which may be posed using a contrived set of Q/A context-driven dataset.
## 🧑💻 Other Datasets for Everything Has Context
1️⃣ <i>real world context:</i> not out yet but it's comin'. I have the context though I don't have the generations, give it a week max from this README commit's date.
2️⃣ <i>databricks-dolly-15k x real world context:</i> see 1
----
#### 💬 Citation
<sup>*</sup> <small>we do this after the work in 1, 2 because it removes the potential of sticky base model knowledge affecting the context and Q/A diversity! we do only 118 companies because the company names don't matter, facts in context do</small>
<sup>1</sup> <small>contrived is a term we use here to say there was a prompt engineered to create the data from a world-class model</small>
<sup>2</sup> <small>@misc{press2023measuring,
title={Measuring and Narrowing the Compositionality Gap in Language Models},
author={Ofir Press and Muru Zhang and Sewon Min and Ludwig Schmidt and Noah A. Smith and Mike Lewis},
year={2023},
eprint={2210.03350},
archivePrefix={arXiv},
primaryClass={cs.CL}
}</small> | csujeong/financial_company_revenue | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"arxiv:2210.03350",
"region:us"
] | 2023-12-24T00:59:19+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "ehc-contrived-financial"} | 2023-12-24T09:45:17+00:00 | [
"2210.03350"
] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #arxiv-2210.03350 #region-us
| # Everything Has Context | contrived company research example (ehc-contrived-financial)
### Description
'company_revenue_train.csv'(Citation : URL of dylanalloy/ehc-contrived-financial) dataset contains 12,515 rows of high-quality contrived<sup>1</sup> research patterns in the public market equities category for Q/A pairs with a high perplexity<sup>2</sup>.
The data is generated from 'davinci-turbo' using the OpenAI API with prompts engineered to do several things which incite a grounded hallucinatory research example each call:
1. Generate one-shot Q/A example with a mask for the subject using the syntax '[Company]' which has a high perplexity thus requires multiple follow up questions (or the answer itself requires two sources of external context).
2. Between the question and answer of each one-shot example, hallucinate context from a search of equity filings data required to get to the answer.
3. Replace '[Company]' instances with a random company from a list in our case of 118 companies<sup>*</sup>
4. Filter on all rows for conditions which suit your needs (we choose higher perplexity which we define in a contrived dataset as: '∀(context,followup)∈S, where S is the dataset, and ∣{(context,followup)}∣>2')
### Contrived!
It's not real context. We are researching what this means for compositionality gaps in the respective domain for the model finetuning. There are perhaps more obvious limitations around the ability to reason on questions with high perplexity involved which the model has not been finetuned on, especially as reasoning about the question's context requirements could grow. Naively-posed questions, loaded questions, or questions of a contradictory manner may throw off the reasoning and context retrieval abilities of a finetuned model derived from a contrived 'environment', if you will. These are just some of the challenges which may be posed using a contrived set of Q/A context-driven dataset.
## Other Datasets for Everything Has Context
1️⃣ <i>real world context:</i> not out yet but it's comin'. I have the context though I don't have the generations, give it a week max from this README commit's date.
2️⃣ <i>databricks-dolly-15k x real world context:</i> see 1
----
#### Citation
<sup>*</sup> <small>we do this after the work in 1, 2 because it removes the potential of sticky base model knowledge affecting the context and Q/A diversity! we do only 118 companies because the company names don't matter, facts in context do</small>
<sup>1</sup> <small>contrived is a term we use here to say there was a prompt engineered to create the data from a world-class model
<sup>2</sup> <small>@misc{press2023measuring,
title={Measuring and Narrowing the Compositionality Gap in Language Models},
author={Ofir Press and Muru Zhang and Sewon Min and Ludwig Schmidt and Noah A. Smith and Mike Lewis},
year={2023},
eprint={2210.03350},
archivePrefix={arXiv},
primaryClass={cs.CL}
}</small> | [
"# Everything Has Context | contrived company research example (ehc-contrived-financial)",
"### Description \n\n'company_revenue_train.csv'(Citation : URL of dylanalloy/ehc-contrived-financial) dataset contains 12,515 rows of high-quality contrived<sup>1</sup> research patterns in the public market equities category for Q/A pairs with a high perplexity<sup>2</sup>. \n\nThe data is generated from 'davinci-turbo' using the OpenAI API with prompts engineered to do several things which incite a grounded hallucinatory research example each call:\n\n1. Generate one-shot Q/A example with a mask for the subject using the syntax '[Company]' which has a high perplexity thus requires multiple follow up questions (or the answer itself requires two sources of external context).\n2. Between the question and answer of each one-shot example, hallucinate context from a search of equity filings data required to get to the answer.\n3. Replace '[Company]' instances with a random company from a list in our case of 118 companies<sup>*</sup>\n4. Filter on all rows for conditions which suit your needs (we choose higher perplexity which we define in a contrived dataset as: '∀(context,followup)∈S, where S is the dataset, and ∣{(context,followup)}∣>2')",
"### Contrived!\nIt's not real context. We are researching what this means for compositionality gaps in the respective domain for the model finetuning. There are perhaps more obvious limitations around the ability to reason on questions with high perplexity involved which the model has not been finetuned on, especially as reasoning about the question's context requirements could grow. Naively-posed questions, loaded questions, or questions of a contradictory manner may throw off the reasoning and context retrieval abilities of a finetuned model derived from a contrived 'environment', if you will. These are just some of the challenges which may be posed using a contrived set of Q/A context-driven dataset.",
"## Other Datasets for Everything Has Context\n\n1️⃣ <i>real world context:</i> not out yet but it's comin'. I have the context though I don't have the generations, give it a week max from this README commit's date.\n\n2️⃣ <i>databricks-dolly-15k x real world context:</i> see 1\n\n----",
"#### Citation\n\n<sup>*</sup> <small>we do this after the work in 1, 2 because it removes the potential of sticky base model knowledge affecting the context and Q/A diversity! we do only 118 companies because the company names don't matter, facts in context do</small>\n\n<sup>1</sup> <small>contrived is a term we use here to say there was a prompt engineered to create the data from a world-class model\n\n<sup>2</sup> <small>@misc{press2023measuring,\n title={Measuring and Narrowing the Compositionality Gap in Language Models}, \n author={Ofir Press and Muru Zhang and Sewon Min and Ludwig Schmidt and Noah A. Smith and Mike Lewis},\n year={2023},\n eprint={2210.03350},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}</small>"
] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #arxiv-2210.03350 #region-us \n",
"# Everything Has Context | contrived company research example (ehc-contrived-financial)",
"### Description \n\n'company_revenue_train.csv'(Citation : URL of dylanalloy/ehc-contrived-financial) dataset contains 12,515 rows of high-quality contrived<sup>1</sup> research patterns in the public market equities category for Q/A pairs with a high perplexity<sup>2</sup>. \n\nThe data is generated from 'davinci-turbo' using the OpenAI API with prompts engineered to do several things which incite a grounded hallucinatory research example each call:\n\n1. Generate one-shot Q/A example with a mask for the subject using the syntax '[Company]' which has a high perplexity thus requires multiple follow up questions (or the answer itself requires two sources of external context).\n2. Between the question and answer of each one-shot example, hallucinate context from a search of equity filings data required to get to the answer.\n3. Replace '[Company]' instances with a random company from a list in our case of 118 companies<sup>*</sup>\n4. Filter on all rows for conditions which suit your needs (we choose higher perplexity which we define in a contrived dataset as: '∀(context,followup)∈S, where S is the dataset, and ∣{(context,followup)}∣>2')",
"### Contrived!\nIt's not real context. We are researching what this means for compositionality gaps in the respective domain for the model finetuning. There are perhaps more obvious limitations around the ability to reason on questions with high perplexity involved which the model has not been finetuned on, especially as reasoning about the question's context requirements could grow. Naively-posed questions, loaded questions, or questions of a contradictory manner may throw off the reasoning and context retrieval abilities of a finetuned model derived from a contrived 'environment', if you will. These are just some of the challenges which may be posed using a contrived set of Q/A context-driven dataset.",
"## Other Datasets for Everything Has Context\n\n1️⃣ <i>real world context:</i> not out yet but it's comin'. I have the context though I don't have the generations, give it a week max from this README commit's date.\n\n2️⃣ <i>databricks-dolly-15k x real world context:</i> see 1\n\n----",
"#### Citation\n\n<sup>*</sup> <small>we do this after the work in 1, 2 because it removes the potential of sticky base model knowledge affecting the context and Q/A diversity! we do only 118 companies because the company names don't matter, facts in context do</small>\n\n<sup>1</sup> <small>contrived is a term we use here to say there was a prompt engineered to create the data from a world-class model\n\n<sup>2</sup> <small>@misc{press2023measuring,\n title={Measuring and Narrowing the Compositionality Gap in Language Models}, \n author={Ofir Press and Muru Zhang and Sewon Min and Ludwig Schmidt and Noah A. Smith and Mike Lewis},\n year={2023},\n eprint={2210.03350},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}</small>"
] | [
51,
24,
322,
163,
88,
230
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #arxiv-2210.03350 #region-us \n# Everything Has Context | contrived company research example (ehc-contrived-financial)### Description \n\n'company_revenue_train.csv'(Citation : URL of dylanalloy/ehc-contrived-financial) dataset contains 12,515 rows of high-quality contrived<sup>1</sup> research patterns in the public market equities category for Q/A pairs with a high perplexity<sup>2</sup>. \n\nThe data is generated from 'davinci-turbo' using the OpenAI API with prompts engineered to do several things which incite a grounded hallucinatory research example each call:\n\n1. Generate one-shot Q/A example with a mask for the subject using the syntax '[Company]' which has a high perplexity thus requires multiple follow up questions (or the answer itself requires two sources of external context).\n2. Between the question and answer of each one-shot example, hallucinate context from a search of equity filings data required to get to the answer.\n3. Replace '[Company]' instances with a random company from a list in our case of 118 companies<sup>*</sup>\n4. Filter on all rows for conditions which suit your needs (we choose higher perplexity which we define in a contrived dataset as: '∀(context,followup)∈S, where S is the dataset, and ∣{(context,followup)}∣>2')"
] |
b42147dbad3abf2eb4b0b5dd0759d3625e58b06f |
# Steve Jobs Interviews Database
[Support this project on Ko-fi](https://ko-fi.com/hypersniper)
## Project Overview
This project contains multiple interviews of Steve Jobs during his time before and after Apple.
### Goal
The primary goal of this dataset was to fine-tune a language model to output Steve Jobs' views and thoughts.
## Performance
The performance of this small dataset is very noteworthy. Due to the nature of the dataset being interview question-and-answer pairs, the replies of the model seem to follow this pattern as well.
- **Model:** Mistral 7B (Fine-Tuned Model) [https://huggingface.co/Hypersniper/Steve_Jobs_Mistral_7B]
- **Fine-Tuning:** 14 Epochs \ 128 LoRA Rank \ Loss 0.2149 (a configuration sketch is shown below)
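A sketch of a PEFT LoRA configuration matching the rank reported above; the alpha, dropout, and target modules are assumptions rather than values reported by the author:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=128,                    # LoRA rank reported above
    lora_alpha=256,           # assumed (a common choice is 2 * r)
    lora_dropout=0.05,        # assumed
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # typical Mistral attention projections
    task_type="CAUSAL_LM",
)
# Train for 14 epochs; the author reports a final training loss of ~0.2149.
```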
### Sample Questions and Outputs
#### Question 1

#### Question 2
 | Hypersniper/Steve_Jobs_Interviews | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"steve jobs",
"steve",
"interviews",
"region:us"
] | 2023-12-24T01:15:50+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-generation"], "pretty_name": "Steve Jobs", "tags": ["steve jobs", "steve", "interviews"]} | 2023-12-24T01:50:35+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-English #license-apache-2.0 #steve jobs #steve #interviews #region-us
|
# Steve Jobs Interviews Database
Support this project on Ko-fi
## Project Overview
This project contains multiple interviews of Steve Jobs during his time before and after Apple.
### Goal
The primary goal of this dataset was to fine-tune a language model to output Steve Jobs views and thoughts.
## Performance
The performance of this small dataset is very noteworthy. Do to the nature of the database being interview question and answer pairs the replies of the model seems to follow this pattern as well.
- Model: Mistral 7B (Fine-Tuned Model) [URL
- Fine-Tuning: 14 Epochs \ 128 Lora Rank \ Loss 0.2149
### Sample Questions and Outputs
#### Question 1
!Screenshot 2023-12-23 URL
#### Question 2
!Screenshot 2023-12-23 URL | [
"# Steve Jobs Interviews Database\n\nSupport this project on Ko-fi",
"## Project Overview\nThis project contains multiple interviews of Steve Jobs during his time before and after Apple.",
"### Goal\nThe primary goal of this dataset was to fine-tune a language model to output Steve Jobs views and thoughts.",
"## Performance\nThe performance of this small dataset is very noteworthy. Do to the nature of the database being interview question and answer pairs the replies of the model seems to follow this pattern as well.\n- Model: Mistral 7B (Fine-Tuned Model) [URL\n- Fine-Tuning: 14 Epochs \\ 128 Lora Rank \\ Loss 0.2149",
"### Sample Questions and Outputs",
"#### Question 1\n!Screenshot 2023-12-23 URL",
"#### Question 2\n!Screenshot 2023-12-23 URL"
] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #license-apache-2.0 #steve jobs #steve #interviews #region-us \n",
"# Steve Jobs Interviews Database\n\nSupport this project on Ko-fi",
"## Project Overview\nThis project contains multiple interviews of Steve Jobs during his time before and after Apple.",
"### Goal\nThe primary goal of this dataset was to fine-tune a language model to output Steve Jobs views and thoughts.",
"## Performance\nThe performance of this small dataset is very noteworthy. Do to the nature of the database being interview question and answer pairs the replies of the model seems to follow this pattern as well.\n- Model: Mistral 7B (Fine-Tuned Model) [URL\n- Fine-Tuning: 14 Epochs \\ 128 Lora Rank \\ Loss 0.2149",
"### Sample Questions and Outputs",
"#### Question 1\n!Screenshot 2023-12-23 URL",
"#### Question 2\n!Screenshot 2023-12-23 URL"
] | [
50,
13,
22,
27,
82,
10,
13,
13
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #license-apache-2.0 #steve jobs #steve #interviews #region-us \n# Steve Jobs Interviews Database\n\nSupport this project on Ko-fi## Project Overview\nThis project contains multiple interviews of Steve Jobs during his time before and after Apple.### Goal\nThe primary goal of this dataset was to fine-tune a language model to output Steve Jobs views and thoughts.## Performance\nThe performance of this small dataset is very noteworthy. Do to the nature of the database being interview question and answer pairs the replies of the model seems to follow this pattern as well.\n- Model: Mistral 7B (Fine-Tuned Model) [URL\n- Fine-Tuning: 14 Epochs \\ 128 Lora Rank \\ Loss 0.2149### Sample Questions and Outputs#### Question 1\n!Screenshot 2023-12-23 URL#### Question 2\n!Screenshot 2023-12-23 URL"
] |
ebc9aac3ea14c4b26469fe3ba1f8642b31ce0341 | # Dataset Card for "allnli-withnegs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gowitheflow/allnli-withnegs | [
"region:us"
] | 2023-12-24T01:43:48+00:00 | {"dataset_info": {"features": [{"name": "sentence1", "dtype": "string"}, {"name": "sentence2", "dtype": "string"}, {"name": "sentence3", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 51457205, "num_examples": 277277}], "download_size": 31419180, "dataset_size": 51457205}} | 2023-12-24T01:54:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "allnli-withnegs"
More Information needed | [
"# Dataset Card for \"allnli-withnegs\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"allnli-withnegs\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"allnli-withnegs\"\n\nMore Information needed"
] |
c1497a270f28d69bc552ce23f8bf9cbc820bab73 |
# Dataset Card for Evaluation run of maywell/PiVoT-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/PiVoT-MoE](https://huggingface.co/maywell/PiVoT-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-MoE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-24T01:47:47.057722](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-MoE/blob/main/results_2023-12-24T01-47-47.057722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6069679478753063,
"acc_stderr": 0.03311851757787681,
"acc_norm": 0.6115463544343916,
"acc_norm_stderr": 0.03378897575698116,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5463839311843238,
"mc2_stderr": 0.016228712279771185
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735567,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.6621190997809201,
"acc_stderr": 0.004720210816162055,
"acc_norm": 0.8351921927902808,
"acc_norm_stderr": 0.003702487662126949
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357334,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153327,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153327
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597563,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597563
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183877,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5463839311843238,
"mc2_stderr": 0.016228712279771185
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207402
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.013442502402794302
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_maywell__PiVoT-MoE | [
"region:us"
] | 2023-12-24T01:50:13+00:00 | {"pretty_name": "Evaluation run of maywell/PiVoT-MoE", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/PiVoT-MoE](https://huggingface.co/maywell/PiVoT-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-MoE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T01:47:47.057722](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-MoE/blob/main/results_2023-12-24T01-47-47.057722.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6069679478753063,\n \"acc_stderr\": 0.03311851757787681,\n \"acc_norm\": 0.6115463544343916,\n \"acc_norm_stderr\": 0.03378897575698116,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5463839311843238,\n \"mc2_stderr\": 0.016228712279771185\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735567,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6621190997809201,\n \"acc_stderr\": 0.004720210816162055,\n \"acc_norm\": 0.8351921927902808,\n \"acc_norm_stderr\": 0.003702487662126949\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03899073687357334,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03899073687357334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n 
\"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153327,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153327\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n 
\"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597563,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597563\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 
0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n \"acc_stderr\": 0.015721531075183877,\n \"acc_norm\": 0.329608938547486,\n \"acc_norm_stderr\": 0.015721531075183877\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5463839311843238,\n \"mc2_stderr\": 0.016228712279771185\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \"acc_stderr\": 0.013442502402794302\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/PiVoT-MoE", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|arc:challenge|25_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|gsm8k|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hellaswag|10_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["**/details_harness|winogrande|5_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T01-47-47.057722.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T01_47_47.057722", "path": ["results_2023-12-24T01-47-47.057722.parquet"]}, {"split": "latest", "path": 
["results_2023-12-24T01-47-47.057722.parquet"]}]}]} | 2023-12-24T01:50:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/PiVoT-MoE
Dataset automatically created during the evaluation run of model maywell/PiVoT-MoE on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-24T01:47:47.057722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of maywell/PiVoT-MoE\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T01:47:47.057722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/PiVoT-MoE\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T01:47:47.057722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/PiVoT-MoE\n\n\n\nDataset automatically created during the evaluation run of model maywell/PiVoT-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T01:47:47.057722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9d74a9d1894bae8a67d9f2b450d3125fa129319f |
<div align="center">
<h1>Under the Surface: Tracking the Artifactuality of LLM-Generated Data</h1>
<!-- **Authors:** -->
_**Debarati Das<sup>†</sup><sup>¶</sup>, Karin de Langis<sup>¶</sup>, Anna Martin-Boyle<sup>¶</sup>, Jaehyung Kim<sup>¶</sup>, Minhwa Lee<sup>¶</sup>, Zae Myung Kim<sup>¶</sup><br>**_
_**Shirley Anugrah Hayati, Risako Owan, Bin Hu, Ritik Sachin Parkar, Ryan Koo,
Jong Inn Park, Aahan Tyagi, Libby Ferland, Sanjali Roy, Vincent Liu**_
_**Dongyeop Kang<br>**_
_**Minnesota NLP, University of Minnesota Twin Cities**_
<!-- **Affiliations:** -->
<sup>†</sup> Project Lead,
<sup>¶</sup> Core Contribution,
<a href="https://arxiv.org/abs/2401.14698"> Arxiv </a>
<a href="https://minnesotanlp.github.io/artifact/"> Project Page </a>
</div>
## 📌 Table of Contents
- [Introduction](#🚀-introduction)
- [Dataset Structure](#📝-dataset)
- [Task Label](#1-task-label)
- [Preference](#2-preference)
- [Instructions](#3-instructions)
- [Simulation](#4-simulation)
- [Free-form Text](#5-free-form-text)
- [Citation](#📚-citation)
## 🚀 Introduction
<div align="center">
<img src="iceberg_modified.png" style="width:50%;height:auto;" align="center">
</div>
We present a pioneering effort in gathering a diverse range of text data produced by LLMs, covering everything from more structured "task labels" to open-ended "free-form text." This comprehensive collection is significant as it allows for a unique and holistic examination of LLM outputs and provides insights into how LLMs perform under varying degrees of structure and freedom, which is essential for both understanding their current state and guiding future improvements and applications.
We aggregate and conduct comprehensive stress tests on various data generated by LLMs using existing benchmarks, offering a thorough evaluation of the quality, consistency, and reliability of LLM outputs across diverse models and scenarios, and providing insight into their strengths and weaknesses for future research and development.
Our research emphasizes the critical need for responsible and ethical practices in creating and using LLM-generated data, advocating for collaborative efforts among stakeholders to address biases, increase diversity, and deepen the understanding of complex human opinions in LLM outputs, thereby ensuring their development benefits society ethically and sustainably.
## 📝 Dataset
The dataset consists of **five** different types of LLM-generated data: **(1) Task Labels, (2) Preference, (3) Instructions, (4) Simulation, and (5) Free-form Texts**.
<hr>
### 1. Task Label
#### (1) Dataset Info
Contains human/machine annotations from source datasets and their majority/minority label aggregations.
#### (2) Data Sources - License
- [Social Bias Frames (SBIC)](https://huggingface.co/datasets/social_bias_frames) - cc-by-4.0
- [GAB Hate Corpus (GHC)](https://osf.io/edua3/) - cc-by-4.0 International
- [Age-Related-Sentiment (Sentiment)](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/F6EMTS) - cc-by-1.0 Universal
- [Social Chemistry (Schem5Labels)](https://github.com/mbforbes/social-chemistry-101) - CC BY-SA 4.0
#### (3) Column Info
- `'model_name'`: specifies the model that was prompted to generate the model annotations for the text. This can take values: vicuna, baize, llama2, koala, open_ai_gpt35turbo
- `'dataset_name'`: specifies the source dataset of the text. This can take values: SBIC, GHC, Sentiment, and Schem5Labels
- `'text_ind'`: this is the unique index of the text in the complete dataset
- `'text'`: this is the text which the human or machine needs to provide an annotation for
- `'prompt'`: This is the prompt provided to the model for the annotation task
- `'human_annots'`: This consists of the list of annotations generated by human annotators for this task. These are ordinal categorical variables.
- `'model_annots'`: This consists of the list of annotations generated by model annotators for this task. These are ordinal categorical variables. If a value is -1 in this list, it means the model did not return a response for this text.
- `'human_majority'`: this consists of a list containing the majority annotation value(s) among the human-annotated list for that text.
- `'machine_majority'`: this consists of a list containing the majority annotation value(s) among the machine-annotated list for that text.
- `'human_minority'`: this consists of a list containing the minority annotation value(s) among the human-annotated list for that text.
- `'machine_minority'`: this consists of a list containing the minority annotation value(s) among the machine-annotated list for that text.
#### (4) How to access
There is one subset associated with this data type:
- **task_label**: intermodel setup with majority/minority opinions aggregated from all data sources
Use the example code below to load the task label subset. Change the subset or split name as needed.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "task_label", split='train') # streaming=True (optional)
```
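
Continuing from the load above, each row pairs the list-valued `human_annots` and `model_annots` columns described earlier. As a small illustration (not part of the released code, and assuming the annotation lists deserialize to numeric values), one can inspect a row and drop model non-responses before comparing the two annotation sets:

```python
row = dataset[0]

human = row["human_annots"]
model = [a for a in row["model_annots"] if a != -1]  # -1 marks a missing model response

print(row["dataset_name"], row["text_ind"])
print("human:", human, "| model:", model)
```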
#### (5) Qualitative Analysis
To view the examples used in the qualitative analysis of bias annotations, copy and paste the code below:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_tasklabel", split='train')
```
#### (6) Others
For majority/minority calculation, please note the following:
- The function returns a list of the values that form the majority or minority of the given annotation list. For example, if the input list is [1.0, 1.0, 2.0, 2.0, 3.0], the majority value will be [1.0, 2.0] and the minority value will be [3.0].
- If all values in the annotation list are -1, then no valid majority or minority can be calculated. Therefore, None is returned.
- If every value in the annotation list is unique (each value appears exactly once), then no valid majority or minority can be calculated. Therefore, None is returned.
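
The aggregation logic can be sketched as follows. This is an illustrative re-implementation of the rules above rather than the released code, and it assumes model non-responses (-1) are dropped before counting:

```python
from collections import Counter

def majority_minority(annots):
    """Illustrative majority/minority aggregation following the rules above."""
    valid = [a for a in annots if a != -1]   # assumption: -1 (no response) is ignored
    if not valid:                            # all annotations were -1 -> no valid result
        return None, None
    counts = Counter(valid)
    if len(counts) > 1 and len(set(counts.values())) == 1:
        return None, None                    # every value equally frequent -> no majority/minority
    hi, lo = max(counts.values()), min(counts.values())
    majority = [v for v, c in counts.items() if c == hi]
    minority = [v for v, c in counts.items() if c == lo]
    return majority, minority

print(majority_minority([1.0, 1.0, 2.0, 2.0, 3.0]))  # ([1.0, 2.0], [3.0])
```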
<hr>
### 2. Preference
#### (1) Dataset Info
Contains human/machine preferences from the source datasets, along with the lexicon-based (for p2c) and entailment-based (for CoBBLEr) preferences.
#### (2) Data Sources (License)
- [Prefer to Classify ('p2c')](https://arxiv.org/pdf/2306.04925.pdf)
- Note that the sentences are originally extracted from [DynaSent Round 2](https://huggingface.co/datasets/dynabench/dynasent/viewer/dynabench.dynasent.r2.all)
- [CoBBLEr](https://minnesotanlp.github.io/cobbler-project-page/demo/index.html)
- The sentences are originally extracted from [Eli5](https://huggingface.co/datasets/eli5) and [BigBench](https://huggingface.co/datasets/bigbench).
#### (3) Column Info
Each row commonly contains a pair of sentences ('sent_1' and 'sent_2') along with human and machine preferences.
- Preference Label 0: prefer sent_1
- Preference Label 1: prefer sent_2
- Preference Label 2: tie (no preference)
For the p2c dataset, each row additionally contains the sentiment lexicon-based preference and the difference in lexicon scores between the two sentences.
- `'sent_1'`: sentence 1 of a pair
- `'sent_2'`: sentence 2 of a pair
- `'gold_label'`: the gold sentiment label of both `'sent_1'` and `'sent_2'` (e.g., positive/negative/neutral)
- `'human_pref'`: human preference
- `'gpt3_pref'`: GPT-3 preference
- `'lexicon_pref'`: the lexicon-based preference between `'sent_1'` and `'sent_2'`
- `'lexicon_diff'`: the difference in lexicon scores between sentence pairs
For the CoBBLEr dataset, each row additionally contains the textual entailment-based preference and the difference in entailment scores between the two sentences.
- `'model_1'`: the model name that generated sentence 1
- `'model_2'`: the model name that generated sentence 2
- `'sentence_1'`: sentence 1 of a pair
- `'sentence_2'`: sentence 2 of a pair
- `'human_pref'`: human preference
- `'machine_pref'`: LLM preference (GPT-4 or ChatGPT)
- `'entail_pref'`: the entailment-based preference between `'sentence_1'` and `'sentence_2'`
- `'entail_diff'`: the difference in entailment scores (computed by RoBERTa-large-MNLI) between two sentences in a pair.
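The exact scoring pipeline behind `'entail_pref'` and `'entail_diff'` is not included here, but as a rough, non-authoritative sketch, an entailment probability between two sentences can be obtained from RoBERTa-large-MNLI as follows (the premise/hypothesis ordering and the preference rule are our assumptions):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("roberta-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
ent_idx = model.config.label2id["ENTAILMENT"]

def entail_score(premise, hypothesis):
    """Probability that `premise` entails `hypothesis` under roberta-large-mnli."""
    inputs = tok(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = model(**inputs).logits.softmax(dim=-1)[0]
    return probs[ent_idx].item()

s1, s2 = "The movie was wonderful.", "The movie was enjoyable."
entail_diff = entail_score(s1, s2) - entail_score(s2, s1)
entail_pref = 0 if entail_diff > 0 else 1  # a tie (label 2) would need an explicit threshold
```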
#### (4) How to access
There are three subsets associated with this data type:
- **preference_p2c**: p2c data with human and GPT-3 preferences
- **preference_cobbler_gpt4**: cobbler data with human and GPT-4 preferences
- **preference_cobbler_chatgpt**: cobbler data with human and ChatGPT preferences
Use the example code below to load the preference_cobbler_gpt4 subset; change the subset name as needed.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "preference_cobbler_gpt4", split='train')
```
#### (5) Qualitative Analysis
For `'p2c'` dataset, we release the data with each sentence in a pair annotated with extracted lexicons based on [Hayati et al (2021)](https://aclanthology.org/2021.emnlp-main.510/).
Also, for several columns in this data, their value consists of a dictionary where each key is the extracted lexicon and its value is the corresponding importance.
For example, the column `'sent_{1/2}_anger'` is a dictionary of anger-related lexicons with the corresponding importance scores in the (first/second) sentence.
Our study uses the first key with the maximum value score in each lexicon group to decide lexicon-based preferences (a minimal sketch of this comparison follows the column lists below).
To use this dataset, please note the following:
```python
from datasets import load_dataset
import pandas as pd

dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_preference_p2c", split='train')
dataset = pd.DataFrame(dataset)  # convert the split to a pandas DataFrame
```
For sentence pairs of positive sentiment, we used the following columns:
- `'sent_{1/2}_{joy/politeness}_words'` and
- `'sent_{1/2}_sentiment_words'` with values greater than 0 (positive).

Conversely, for the pairs of negative sentiment, we used the following columns:
- `'sent_{1/2}_{anger/disgust/fear/sad/offensive}_words'`,
- `'sent_{1/2}_polite_words'` with values below 0 (rudeness) and
- `'sent_{1/2}_sentiment_words'` with values below 0 (negative).
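As an illustration of the lexicon dictionaries described above, the fragment below compares the strongest entry of one lexicon group across a sentence pair. It is a minimal sketch, not the paper's exact aggregation; the column name `sent_{1/2}_joy_words` follows the pattern above, and the dictionaries may be stored as strings depending on how the CSV is parsed:

```python
import ast

def max_lexicon_score(cell):
    """Importance of the strongest lexicon entry in one sentence (0 if empty)."""
    lex = ast.literal_eval(cell) if isinstance(cell, str) else (cell or {})
    return max(lex.values(), default=0.0)

row = dataset.iloc[0]  # 'dataset' is the DataFrame built in the snippet above
s1 = max_lexicon_score(row["sent_1_joy_words"])
s2 = max_lexicon_score(row["sent_2_joy_words"])
lexicon_pref = 0 if s1 > s2 else (1 if s2 > s1 else 2)  # 0/1/2 as defined earlier
```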
#### (6) Others
<hr>
### 3. Instructions
#### (1) Dataset Info
(1) Human annotations of error types in 800 examples from four different synthetic instruction datasets, and (2) three random samplings of 10k samples for each of the following datasets: Cleaned Alpaca, Dolly, Self Instruct, and Supernatural Instructions. This amounts to 30k samples per dataset (10k for each of 3 seeds).
#### (2) Data Sources (License)
- [Unnatural Instructions](https://github.com/orhonovich/unnatural-instructions) - MIT
- [Self-Instruct](https://github.com/yizhongw/self-instruct) - Apache License 2.0
- [Alpaca-Cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) - Creative Commons NonCommercial (CC BY-NC 4.0).
- [GPT-4-LLM](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM) - Creative Commons NonCommercial (CC BY-NC 4.0).
- [Dolly](https://github.com/databrickslabs/dolly) - Apache License 2.0
- [Supernatural Instructions](https://github.com/allenai/natural-instructions) - Apache License 2.0
#### (3) Column Info
(1) Error Annotations
- `'instruction'`: an instruction to follow
- `'constraints'`: samples from the Unnatural Instruction set have an additional data type called `'constraints'`, which specifies the form the output should take (e.g. `output should be 'True' or 'False'`)
- `'input'`: an input to the corresponding instruction
- `'output'`: the output given the corresponding instruction and the input
- `'dataset'`: the name of the source dataset that the instruction comes from
- `'QA_type'`: the question-answer type (Open-QA or Closed-QA)
- `'error'`: the error type (one of the following: incomprehensible instruction, inconsistent input, inconsistent output, and incorrect output)
- `'second_error'`: sometimes a sample contains more than one error; a second error will be denoted in this column
- `'third_error'`: a third error will be denoted in this column
#### (4) How to access
(1) Error Annotations:
- **instruction**: first-order experiment setup with error type annotations aggregated from all data sources
Use the example code below to load the instruction subset.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "instruction", split='train')
```
#### (5) Qualitative Analysis
The `'instruction'` experiment is based on the manual annotations of each error type found in the synthetic datasets.
Thus, if you want to view examples for qualitative analysis, use the same split information as below:
```python
from datasets import load_dataset
import pandas as pd

dataset = load_dataset("minnesotanlp/LLM-Artifacts", "instruction", split='train')
data = dataset.to_pandas()  # the subset is already loaded, so convert it directly to a DataFrame
```
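For instance, once the subset is in a DataFrame, the distribution of annotated error types can be inspected directly (the literal values below follow the column descriptions above and may need adjusting to the exact strings in the data):

```python
# tally first-error annotations across all source datasets
print(data["error"].value_counts(dropna=False))

# inspect a few Closed-QA samples that were flagged with an error
flagged = data[(data["QA_type"] == "Closed-QA") & data["error"].notna()]
print(flagged[["dataset", "instruction", "error"]].head())
```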
#### (6) Others
**For the second-order experiment,**
Please use [this dataset (`instruction_fine-tuning_data.csv`)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/instruction_fine-tuning_data.csv).
The following is the column information:
- `'task_name'`: the name of the instruction task. Only pertains to Supernatural Instructions
- `'id'`: the Supernatural Instruction id
- `'instruction'`: an instruction to follow
- `'input'`: an input to the corresponding instruction
- `'output'`: the output given the corresponding instruction and the input
- `'categories'`: the task type. Only pertains to Supernatural Instructions
- `'source'`: the instruction source
- `'seed'`: the seed used for the random sampling. One of the following: 2021, 2022, or 2023
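Since the file is hosted on the hub, it can be read straight into pandas; the source/seed filter below is only an example, and the exact string values in `'source'` may differ from what is assumed here:

```python
import pandas as pd

url = ("https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/"
       "resolve/main/instruction_fine-tuning_data.csv")
ft_data = pd.read_csv(url)

# e.g. keep one source and one sampling seed (check ft_data["source"].unique() first)
subset = ft_data[(ft_data["source"] == "dolly") & (ft_data["seed"] == 2021)]
print(len(subset))
```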
<hr>
### 4. Simulation
#### (1) Dataset Info
Contains (1) role-flipping information and (2) digression type information for simulated agent conversations.
#### (2) Data Sources (License)
- [CAMEL AI-Society](https://huggingface.co/datasets/camel-ai/ai_society) - CC-BY-NC 4.0
- [Solo Performance Prompting Grid-World (SPP)](https://github.com/MikeWangWZHL/Solo-Performance-Prompting) - N/A
#### (3) Column Info
(1) Regarding 'CAMEL':
- `'role_flipping_msg_indices'`: a list of indices of role-flipped messages in the conversation
- `'interruption_msg_indices'`: a list of indices of interruption messages in the conversation
- `'role_flipping_happens'`: boolean true when role_flipping_msg_indices is not empty
(2) Regarding 'SPP':
- `'Given Task'`: Given questions with detailed descriptions. The questions are from SPP logic grid puzzle dataset.
- `'Task Label'`: Answer to the given question, which is originally provided by SPP dataset
- `'Response of GPT-4'`: Simulated conversations by multiple agents, generated by GPT-4. These responses are also taken from the SPP dataset itself (method “spp_engine-devgpt4-32k_temp-0.0_topp-1.0_start0-end200__with_sys_mes”).
- `'Prediction of digression by GPT-4'`: Binary prediction (yes or no) about the existence of digression within (c) the simulated conversation.
- `'Reasoning of digression by GPT-4'`: Reasoning about (d) the prediction of digression.
- `'Classification of digression'`: For the simulated conversation predicted to have digression by (d), we further classify the types of digression using GPT-4 again. For the data without digression, this field is provided with ‘N/A’.
- `'Prediction as human-like by GPT-4'`: Binary prediction (human or ai) about the likeliness of (c) given conversation as human’s conversation.
- `'Reasoning as human-like by GPT-4'`: Reasoning about (g) the prediction as human-like.
- `'Prediction of digression by Human Annotators'`: Binary prediction (yes or no) about the existence of digression within (c) the simulated conversation, by three different human annotators.
- `'Prediction as human-like by Human Annotators'`: Binary prediction (human or ai) about the likeliness of (c) given conversation as human’s conversation, by three different human annotators.
#### (4) How to access
There are two subsets associated with this data type:
- **simulation_roleflip**: role-flipping information from CAMEL AI Society dataset
- **simulation_digression**: digression type information from SPP dataset
Use the example code below to load the digression subset; change the subset name as needed:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "simulation_digression", split="train")
```
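For example, after loading the role-flip subset instead, conversations in which role flipping occurs can be selected with the boolean column documented above (a minimal sketch; if the column is stored as a string, compare against `'True'` instead):

```python
from datasets import load_dataset

roleflip = load_dataset("minnesotanlp/LLM-Artifacts", "simulation_roleflip", split="train")
flipped = roleflip.filter(lambda row: bool(row["role_flipping_happens"]))
print(f"{len(flipped)} of {len(roleflip)} conversations contain role-flipped messages")
```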
#### (5) Qualitative Analysis
Only the subset **simulation_digression** contains human/GPT annotations for each simulated conversation between agents.
Therefore, please use the following code to view the qualitative analysis part of the simulation section:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "simulation_digression", split="train", streaming=True)
```
#### (6) Others
To obtain a better prediction and its corresponding reasoning, first generate the prediction and then generate the reasoning, as shown in the provided code.
<hr>
### 5. Free-form Text
#### (1) Dataset Info
Contains Human/Machine texts from source datasets and their classification scores.
If a machine text has a paired human text, the human text's id is associated with the machine texts.
#### (2) Data Sources - License
- [Workers vs GPT ('Workers')](https://github.com/AndersGiovanni/worker_vs_gpt) - MIT
- [Human ChatGPT Comparison Corpus ('HC3')](https://huggingface.co/datasets/Hello-SimpleAI/HC3) - BSD License
- [Deepfake Text Detection in the Wild ('Deepfake')](https://huggingface.co/datasets/yaful/DeepfakeTextDetect) - Apache License 2.0
#### (3) Column Info
**Human data** – 'text', 'label', 'id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor'
<br>
**Machine data** – 'text', 'label', 'model', 'strat', 'human_id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor'
- `'strat'`: the prompting strategy; this is relevant for only a subset of the data
- `'human_id'`: the id of the human text that is its pair, if any
- `'label'`: the label for text classification
- The remaining attributes are outputs from automatic classifiers, not ground-truth labels
#### (4) How to access
There are six subsets associated with this data type:
- **freeform_deepfake_{(human, machine)}**: human/machine outputs from Deepfake dataset
- **freeform_hc3_{(human, machine)}**: human/machine outputs from HC3 dataset
- **freeform_workers_{(human, machine)}**: human/machine outputs from Workers dataset
Use the example code below to load the subset of human outputs from the Deepfake dataset.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "freeform_deepfake_human", split="train")
```
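Because each machine text stores the id of its paired human text in `'human_id'`, the human and machine subsets can be joined; the pandas sketch below assumes the documented column names and keeps only machine texts that have a human pair:

```python
from datasets import load_dataset

human = load_dataset("minnesotanlp/LLM-Artifacts", "freeform_deepfake_human", split="train").to_pandas()
machine = load_dataset("minnesotanlp/LLM-Artifacts", "freeform_deepfake_machine", split="train").to_pandas()

# inner join: machine rows whose 'human_id' matches a human 'id'
pairs = machine.merge(human, left_on="human_id", right_on="id", suffixes=("_machine", "_human"))
print(pairs[["text_machine", "text_human"]].head())
```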
#### (5) Qualitative Analysis
To view examples used in the qualitative analysis, please copy and paste the below code:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_freeform", split="train")
```
#### (6) Others
**For Discourse artifact analyses**, please download the following two pickle files to see the network motifs:
- [Network Motifs (Validation)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/DeepfakeTextDetect.validation.discourse_added.networkx_added.motifs_added.pkl)
- [Network Motifs (Test)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/DeepfakeTextDetect.test.discourse_added.networkx_added.motifs_added.pkl)
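Once downloaded locally, the pickles can be opened with the standard library (file name shown for the validation split; `networkx` may need to be installed for the pickled graph objects to deserialize):

```python
import pickle

path = "DeepfakeTextDetect.validation.discourse_added.networkx_added.motifs_added.pkl"
with open(path, "rb") as f:
    motifs = pickle.load(f)

print(type(motifs))  # inspect the top-level structure of the loaded object
```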
<hr>
## 📚 Citation
If you use our paper or this dataset in your research, please cite it as follows:
```bibtex
@misc{das2024surface,
title={Under the Surface: Tracking the Artifactuality of LLM-Generated Data},
author={Debarati Das and Karin De Langis and Anna Martin and Jaehyung Kim and Minhwa Lee and Zae Myung Kim and Shirley Hayati and Risako Owan and Bin Hu and Ritik Parkar and Ryan Koo and Jonginn Park and Aahan Tyagi and Libby Ferland and Sanjali Roy and Vincent Liu and Dongyeop Kang},
year={2024},
eprint={2401.14698},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
If you have any questions or feedback, please feel free to reach out at [email protected].
<!-- # 🤝 Contributing -->
"### 5. Free-form Text",
"#### (1) Dataset Info\n\nContains Human/Machine texts from source datasets and their classification scores. \nIf a machine text has a paired human text, the human text's id is associated with the machine texts.",
"#### (2) Data Sources - License\n- Workers vs GPT ('Workers') - MIT\n- Human ChatGPT Comparison Corpus ('HC3') - BSD License\n- Deepfake Text Detection in the Wild ('Deepfake') - Apache License 2.0",
"#### (3) Column Info \n\nHuman data – 'text', 'label', 'id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor'\n<br>\nMachine data – 'text', 'label', 'model', 'strat', 'human_id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor' \n\n- ''strat'' is the prompting strat; this is relevant for only a subset of the data; 'human_id' is the id of the human text that is its pair, if any\n- ''label'' is the label for text classification\n- Other attributes are just outputs from classifiers, so not GTs",
"#### (4) How to access\n\nThere are six subsets associated with this data type: \n- freeform_deepfake_{(human, machine)}: human/machine outputs from Deepfake dataset\n- freeform_hc3_{(human, machine)}: human/machine outputs from HC3 dataset\n- freeform_workers_{(human, machine)}: human/machine outputs from Workers dataset\n\nUse the example code below to load the subset of human outputs from deepfake dataset.",
"#### (5) Qualitative Analysis\n\nTo view examples used in the qualitative analysis, please copy and paste the below code:",
"#### (6) Others \n\nFor Discourse artifact analyses, please download the following two pickle files to see the network motifs: \n\n- Network Motiffs (Validation)\n- Network Motiffs (Test)\n\n\n\n<hr>",
"## Citation\n\n\nIf you use our paper or this dataset in your research, please cite it as follows:\n\n\n\nIf you have any questions or feedback, please feel free to reach out at lee03533@URL."
] | [
24,
36,
304,
49,
6,
31,
83,
406,
56,
33,
146,
5,
45,
61,
413,
109,
327,
8,
6,
92,
89,
241,
47,
58,
179,
5,
29,
43,
524,
75,
54,
41,
7,
51,
63,
219,
117,
27,
50,
46
] | [
"passage: TAGS\n#arxiv-2401.14698 #arxiv-2306.04925 #region-us \n## Table of Contents\n- Introduction\n- Dataset Structure\n - Task Label\n - Preference\n - Instructions\n - Simulation\n - Free-form Text\n- Citation## Introduction\n\n<div align=\"center\">\n <img src=\"iceberg_modified.png\" style=\"width:50%;height:auto;\" align=\"center\">\n</div>\nWe present a pioneering effort in gathering a diverse range of text data produced by LLMs, covering everything from more structured \"task labels\" to open-ended \"free-form text.\" This comprehensive collection is significant as it allows for a unique and holistic examination of LLM outputs and provides insights into how LLMs perform under varying degrees of structure and freedom, which is essential for both understanding their current state and guiding future improvements and applications. \nWe aggregate and conduct comprehensive stress tests on various data generated by LLMs using the existing benchmarks, offering a thorough evaluation of the quality, consistency, and reliability of LLM outputs across diverse models and scenarios, thereby providing a groundbreaking insight into their strengths and weaknesses for future research and development. \nOur research emphasizes the critical need for responsible and ethical practices in creating and using LLM-generated data, advocating for collaborative efforts among stakeholders to address biases, increase diversity, and deepen the understanding of complex human opinions in LLM outputs, thereby ensuring their development benefits society ethically and sustainably.## Dataset \nThe dataset consists of five different types of LLM-generated data: (1) Task Labels, (2) Preference, (3) Instructions, (4) Simulation, and (5) Free-form Texts.\n\n<hr>### 1. Task Label#### (1) Dataset Info\n\nContains human/machine annotations from source datasets and their majority/minority label aggregations.",
"passage: #### (2) Data Sources - License\n\n- Social Bias Frames (SBIC) - cc-by-4.0\n- GAB Hate Corpus (GHC) - cc-by-4.0 International\n- Age-Related-Sentiment (Sentiment) - cc-by-1.0 Universal\n- Social Chemistry (Schem5Labels) - CC BY-SA 4.0#### (3) Column Info \n\n- ''model_name'': specifies the model that was prompted to generate the model annotations for the text. This can take values: vicuna, baize,llama2, koala, open_ai_gpt35turbo\n- ''dataset_name'': specifies the source dataset of the text. This can take values: SBIC, GHC, Sentiment, and Schem5Labels\n- ''text_ind'': this is the unique index of the text in the complete dataset\n- ''text'': this is the text which the human or machine needs to provide an annotation for\n- ''prompt'': This is the prompt provided to the model for the annotation task\n- ''human_annots'': This consists of the list of annotations generated by human annotators for this task. These are ordinal categorical variables.\n- ''model_annots'': This consists of the list of annotations generated by model annotators for this task. These are ordinal categorical variables. If a value is -1 in this list, it means the model did not return a response for this text. \n- ''human_majority'': this consists of a list containing the majority annotation value(s) among the human-annotated list for that text. \n- ''machine_majority'': this consists of a list containing the majority annotation value(s) among the machine-annotated list for that text. \n- ''human_minority'': this consists of a list containing the minority annotation value(s) among the human-annotated list for that text. \n- ''machine_minority'': this consists of a list containing the minority annotation value(s) among the machine-annotated list for that text.#### (4) How to access\n\nThere is one subset associated with this data type: \n- task_label: intermodel setup with majority/minority opinions aggregated from all data sources\n\nUse the example code below to load the task label split. Change the split name.#### (5) Qualitative Analysis\n\nTo view examples used in the qualitative analysis regarding bias annotations, please copy and paste the below code:",
"passage: #### (6) Others \n\nFor majority/minority calculation, please note the following:\n\n- A list of values that are the majority or minority values in the passed list is returned. For example, if the given input list is [1.0,1.0,2.0,2.0,3.0], then majority value will be [1.0,2.0] and the minority value will be [3.0]\n- If all values in the annotation list are -1, then no valid majority or minority can be calculated. Therefore, None is returned. \n- If all unique values are present in the annotation list, then no valid majority or minority can be calculated. Therefore, None is returned. \n\n\n<hr>### 2. Preference#### (1) Dataset Info\n\nContains Human/Machine Preferences from source datasets and their locality lexicon (for p2c) and entailment (for CoBBLEr) preference.#### (2) Data Sources (License)\n- Prefer to Classify ('p2c')\n - Note that the sentences are originally extracted from DynaSent Round 2\n- CoBBLEr\n - The sentences are originally extracted from Eli5 and BigBench.",
"passage: #### (3) Column Info \n\nCommonly for each row, there are a pair of sentences ('sent_1' and 'sent_2'), with human and machine preferences. \n - Preference Label 0: prefer sent_1\n - Preference Label 1: prefer sent_2\n - Preference Label 2: tie (no preference)\n\nFor p2c dataset, there are the sentiment lexicon-based preference and the difference score between the two sentences in each row. \n\n\n- ''sent_1'': sentence 1 of a pair\n- ''sent_2'': sentence 2 of a pair\n- ''gold_label'': the gold sentiment label of both ''sent_1'' and ''sent_2'' (e.g., positive/negative/neutral)\n- ''human_pref'': human preference\n- ''gpt3_pref'': GPT-3 preference\n- ''lexicon_pref'': the lexicon-based preference between ''sent_1'' and ''sent_2''\n- ''lexicon_diff'': the difference in lexicon scores between sentence pairs\n\n\nFor CoBBLEr dataset, there are textual entailment-based preferences and difference scores between the sentences in each row. \n\n- ''model_1'': the model name that generated sentence 1\n- ''model_2'': the model name that generated sentence 2\n- ''sentence_1'': sentence 1 of a pair\n- ''sentence_2'': sentence 2 of a pair\n- ''human_pref'': human preference\n- ''machine_pref'': LLM preference (GPT-4 or ChatGPT)\n- ''entail_pref'': the entailment-based preference between ''sentence_1'' and ''sentence_2''\n- ''entail_diff'': the difference in entailment scores (computed by RoBERTa-large-MNLI) between two sentences in a pair.#### (4) How to access\n\nThere are three subsets associated with this data type: \n- preference_p2c: p2c data with human and GPT-3 preferences\n- preference_cobbler_gpt4: cobbler data with human and GPT-4 preferences\n- preference_cobbler_chatgpt: cobbler with human and ChatGPT preferences\n\nUse the example code below to load the subset of preference_cobbler_gpt4. Change the subset name.#### (5) Qualitative Analysis\n\nFor ''p2c'' dataset, we release the data with each sentence in a pair annotated with extracted lexicons based on Hayati et al (2021). \nAlso, for several columns in this data, their value consists of a dictionary where each key is the extracted lexicon and its value is the corresponding importance. \nFor example, the column ''sent_{1/2}_anger''is a dictionary of anger-related lexicons with the corresponding importance scores in the (first/second) sentence.\n\nOur study uses the first key with the maximum value score in each lexicon group to decide lexicon-based preferences. \n\n\nTo use this dataset, please note the following:\n\n\nFor sentence pairs of positive sentiment, we used the following columns: \n - ''sent_{1/2}_{joy/politeness}_words'' and \n - ''sent_{1/2}_sentiment_words'' that has values of greater than 0 (positive).\n \nConversely, for the pairs of negative sentiments, we used the following columns: \n - ''sent_{1/2}_{anger/disgust/fear/sad/offensive}_words'',\n - ''sent_{1/2}_polite_words'' that has values of below 0 (rudeness) and \n - ''sent_{1/2}_sentiment_words'' that has values of below 0 (negative).#### (6) Others \n\n\n<hr>### 3. Instructions",
"passage: #### (1) Dataset Info\n\n(1) Human annotations of error types in 800 examples from four different synthetic instruction datasets, and (2) three random samplings of 10k samples for each of the following datasets: Cleaned Alpaca, Dolly, Self Instruct, and Supernatural Instructions. There is a total of 30k samples for each of the datasets (3 seeds each).#### (2) Data Sources (License)\n\n- Unnatural Instructions - MIT\n- Self-Instruct - Apache License 2.0\n- Alpaca-Cleaned - Creative Commons NonCommercial (CC BY-NC 4.0).\n- GPT-4-LLM - Creative Commons NonCommercial (CC BY-NC 4.0).\n- Dolly - Apache License 2.0\n- Supernatural Instructions - Apache License 2.0#### (3) Column Info \n\n(1) Error Annotations \n- ''instruction'': an instruction to follow\n- ''constraints'': samples from the Unnatural Instruction set have an additional data type called ''constraints'', which specify the form the output should take (e.g. 'output should be 'True' or 'False'')\n- ''input'': an input to the corresponding instruction\n- ''output'': the output given the corresponding instruction and the input\n- ''dataset'': the name of the source dataset that the instruction comes from \n- ''QA_type'': the question-answer type (Open-QA or Closed-QA)\n- ''error'': the error type (one of the following: incomprehensible instruction, inconsistent input, inconsistent output, and incorrect output)\n- ''second_error'': sometimes a sample contains more than one error; a second error will be denoted in this column\n- ''third_error'': a third error will be denoted in this column#### (4) How to access\n\n(1) Error Annotations:\n- instruction: first-order experiment setup with error type annotations aggregated from all data sources\n\nUse the example code below to load the instruction subset.#### (5) Qualitative Analysis\n\n\nThe ''instruction'' experiment is based on the manual annotations of each error type found in the synthetic datasets. \nThus, if you want to view examples for qualitative analysis, use the same split information as below:",
"passage: #### (6) Others \n\nFor the second-order experiment,\n\nPlease use this dataset ('instruction_fine-tuning_data.csv').\n\nThe following is the column information:\n\n- ''task_name'': the name of the instruction task. Only pertains to Supernatural Instructions\n- ''id'': the Supernatural Instruction id\n- ''instruction'': an instruction to follow\n- ''input'': an input to the corresponding instruction\n- ''output'': the output given the corresponding instruction and the input\n- ''categories'': the task type. Only pertains to Supernatural Instructions\n- ''source'': the instruction source\n- ''seed'': the seed used for the random sampling. One of the following: 2021, 2022, or 2023\n\n\n<hr>### 4. Simulation#### (1) Dataset Info\n\n\nContains (1) role-flipping information or (2) types of error in digression in simulated agent conversations.#### (2) Data Sources (License)\n\n- CAMEL AI-Society - CC-BY-NC 4.0\n- Solo Performance Prompting Grid-World (SPP) - N/A",
"passage: #### (3) Column Info \n\n(1) Regarding 'CAMEL':\n\n- ''role_flipping_msg_indices'': a list of indices of role-flipped messages in the conversation\n- ''interruption_msg_indices'': a list of indices of interruption messages in the conversation\n- ''role_flipping_happens'': boolean true when role_flipping_msg_indices is not empty\n\n\n(2) Regarding 'SPP':\n\n- ''Given Task'': Given questions with detailed descriptions. The questions are from SPP logic grid puzzle dataset.\n- ''Task Label'': Answer to the given question, which is originally provided by SPP dataset\n- ''Response of GPT-4'': Simulated conversations by multiple agents, generated by GPT-4. These responses are also from SPP dataset itself (method-“spp_engine-devgpt4-32k_temp-0.0_topp-1.0_start0-end200__with_sys_mes”).\n- ''Prediction of digression by GPT-4'': Binary prediction (yes or no) about the existence of digression within (c) the simulated conversation.\n- ''Reasoning of digression by GPT-4'': Reasoning about (d) the prediction of digression.\n- ''Classification of digression'': For the simulated conversation predicted to have digression by (d), we further classify the types of digression using GPT-4 again. For the data without digression, this field is provided with ‘N/A’.\n- ''Prediction as human-like by GPT-4'': Binary prediction (human or ai) about the likeliness of (c) given conversation as human’s conversation.\t\n- ''Reasoning as human-like by GPT-4'': Reasoning about (g) the prediction as human-like.\n- ''Prediction of digression by Human Annotators'': Binary prediction (yes or no) about the existence of digression within (c) the simulated conversation, by three different human annotators. \t\n- ''Prediction as human-like by Human Annotators'': Binary prediction (human or ai) about the likeliness of (c) given conversation as human’s conversation, by three different human annotators.#### (4) How to access\n\nThere are two subsets associated with this data type: \n- simulation_roleflip: role-flipping information from CAMEL AI Society dataset\n- simulation_digression: digression type information from SPP dataset\n \nUse the example code below to load the digression subset. Change the subset name like this:#### (5) Qualitative Analysis\n\nOnly the subset simulation_digression contains human/GPT annotations for each simulated conversation between agents. \nTherefore, please use the following code to view the qualitative analysis part of the simulation section:#### (6) Others \n\nTo get a better prediction and corresponding reasoning for it, please first generate the prediction, and then generate the reasoning as provided in the code.\n\n<hr>### 5. Free-form Text#### (1) Dataset Info\n\nContains Human/Machine texts from source datasets and their classification scores. \nIf a machine text has a paired human text, the human text's id is associated with the machine texts.#### (2) Data Sources - License\n- Workers vs GPT ('Workers') - MIT\n- Human ChatGPT Comparison Corpus ('HC3') - BSD License\n- Deepfake Text Detection in the Wild ('Deepfake') - Apache License 2.0"
] |
11387f3e35e1a498147ae924f3fd93b814faf511 | # Common Voice **Filtered**
A filtered subset of the Common Voice dataset. Currently, this dataset only includes a small subset of English speech.
We only include speech ranked above 3.75 (75%) on the MOS metric, as calculated by the UTMOS system. Approximately 7% of audio qualified for inclusion in this filtered dataset.
This data is not final. Processing the whole Common Voice dataset would require a significant amount of compute; this is just a small sample/MVP of the project.
The code is available on GitHub.
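A minimal loading sketch is shown below; the dataset id is taken from this repository's path, and printing the returned `DatasetDict` avoids guessing split names, which aren't listed on this card.
```python
from datasets import load_dataset

# Dataset id assumed from this repo's path; inspect the DatasetDict for split names.
ds = load_dataset("styletts2-community/common-voice-filtered")
print(ds)
```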
## **Transcriptions**
The transcriptions can be found in the original Common Voice dataset [here](https://huggingface.co/datasets/mozilla-foundation/common_voice_16_0/blob/main/transcript/en/train.tsv).
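If you need the text paired with the audio, a rough sketch of reading that TSV with pandas is shown below; the `path` and `sentence` column names are assumed from the standard Common Voice layout, and the file is assumed to have been downloaded locally first.
```python
import pandas as pd

# train.tsv downloaded from the Common Voice link above; column names are assumptions.
transcripts = pd.read_csv("train.tsv", sep="\t")
print(transcripts[["path", "sentence"]].head())
```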
## **Uses**
Because this dataset is limited to higher-quality audio clips, it is especially suitable for text-to-speech systems; however, it is less suitable for ASR (speech recognition) systems because it excludes lower-quality audio.
## **Length**
Unfortunately, most phrases in this dataset are limited to 5 seconds. | styletts2-community/common-voice-filtered | [
"task_categories:text-to-speech",
"size_categories:n<1K",
"license:cc-by-sa-4.0",
"common-voice",
"region:us"
] | 2023-12-24T03:13:06+00:00 | {"license": "cc-by-sa-4.0", "size_categories": ["n<1K"], "task_categories": ["text-to-speech"], "pretty_name": "Common Voice Filtered", "tags": ["common-voice"]} | 2023-12-24T03:17:47+00:00 | [] | [] | TAGS
#task_categories-text-to-speech #size_categories-n<1K #license-cc-by-sa-4.0 #common-voice #region-us
| # Common Voice Filtered
A filtered subset of the Common Voice dataset. Currently, this dataset only includes a small subset of English speech.
We only include speech ranked above 3.75 (75%) on the MOS metric, as calculated by the UTMOS system. Approximately 7% of audio qualified for inclusion in this filtered dataset.
This data is not final. Processing the whole Common Voice dataset would require a significant amount of compute, this is just a small sample/MVP of the project.
The code is available on GitHub.
## Transcriptions
The transcriptions can be found in the original Common Voice dataset here.
## Uses
Because this dataset is limited to higher-quality audio clips, this dataset is especially suitable for text-to-speech systems, however not as suitable for ASR (speech recognition) systems because it excludes lower-quality audio.
## Length
Unfortunately, most phrases in this dataset are limited to 5 seconds. | [
"# Common Voice Filtered\n\nA filtered subset of the Common Voice dataset. Currently, this dataset only includes a small subset of English speech.\n\nWe only include speech ranked above 3.75 (75%) on the MOS metric, as calculated by the UTMOS system. Approximately 7% of audio qualified for inclusion in this filtered dataset.\n\nThis data is not final. Processing the whole Common Voice dataset would require a significant amount of compute, this is just a small sample/MVP of the project.\n\nThe code is available on GitHub.",
"## Transcriptions\n\nThe transcriptions can be found in the original Common Voice dataset here.",
"## Uses\n\nBecause this dataset is limited to higher-quality audio clips, this dataset is especially suitable for text-to-speech systems, however not as suitable for ASR (speech recognition) systems because it excludes lower-quality audio.",
"## Length\n\nUnfortunately, most phrases in this dataset are limited to 5 seconds."
] | [
"TAGS\n#task_categories-text-to-speech #size_categories-n<1K #license-cc-by-sa-4.0 #common-voice #region-us \n",
"# Common Voice Filtered\n\nA filtered subset of the Common Voice dataset. Currently, this dataset only includes a small subset of English speech.\n\nWe only include speech ranked above 3.75 (75%) on the MOS metric, as calculated by the UTMOS system. Approximately 7% of audio qualified for inclusion in this filtered dataset.\n\nThis data is not final. Processing the whole Common Voice dataset would require a significant amount of compute, this is just a small sample/MVP of the project.\n\nThe code is available on GitHub.",
"## Transcriptions\n\nThe transcriptions can be found in the original Common Voice dataset here.",
"## Uses\n\nBecause this dataset is limited to higher-quality audio clips, this dataset is especially suitable for text-to-speech systems, however not as suitable for ASR (speech recognition) systems because it excludes lower-quality audio.",
"## Length\n\nUnfortunately, most phrases in this dataset are limited to 5 seconds."
] | [
46,
124,
20,
54,
18
] | [
"passage: TAGS\n#task_categories-text-to-speech #size_categories-n<1K #license-cc-by-sa-4.0 #common-voice #region-us \n# Common Voice Filtered\n\nA filtered subset of the Common Voice dataset. Currently, this dataset only includes a small subset of English speech.\n\nWe only include speech ranked above 3.75 (75%) on the MOS metric, as calculated by the UTMOS system. Approximately 7% of audio qualified for inclusion in this filtered dataset.\n\nThis data is not final. Processing the whole Common Voice dataset would require a significant amount of compute, this is just a small sample/MVP of the project.\n\nThe code is available on GitHub.## Transcriptions\n\nThe transcriptions can be found in the original Common Voice dataset here.## Uses\n\nBecause this dataset is limited to higher-quality audio clips, this dataset is especially suitable for text-to-speech systems, however not as suitable for ASR (speech recognition) systems because it excludes lower-quality audio.## Length\n\nUnfortunately, most phrases in this dataset are limited to 5 seconds."
] |
6893320ab798e836fa47d16579ed5ffe319ffd1a | # Dataset Card for "opensinger_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/opensinger_extract_unit | [
"region:us"
] | 2023-12-24T04:20:25+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 300416100, "num_examples": 43075}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 300416100, "num_examples": 43075}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 449971396, "num_examples": 43075}, {"name": "audiodec_24k_320d", "num_bytes": 961193172, "num_examples": 43075}, {"name": "dac_16k", "num_bytes": 1897940708, "num_examples": 43075}, {"name": "dac_24k", "num_bytes": 5413713908, "num_examples": 43075}, {"name": "dac_44k", "num_bytes": 1613103224, "num_examples": 43075}, {"name": "encodec_24k", "num_bytes": 226324972, "num_examples": 43075}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 2405254132, "num_examples": 43075}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 2405254132, "num_examples": 43075}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 2405215988, "num_examples": 43075}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 1208818932, "num_examples": 43075}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 2405215988, "num_examples": 43075}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 2405215988, "num_examples": 43075}, {"name": "speech_tokenizer_16k", "num_bytes": 602279828, "num_examples": 43075}], "download_size": 3902403817, "dataset_size": 25000334568}} | 2023-12-24T04:28:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "opensinger_extract_unit"
More Information needed | [
"# Dataset Card for \"opensinger_extract_unit\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"opensinger_extract_unit\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"opensinger_extract_unit\"\n\nMore Information needed"
] |
5bb167b766bd1d3e3382c36941a638ccd9fa626c |
# Dataset Card for Dataset Name
A Human+LLM annotated dataset of Wikipedia search terms. It is a collection of questions from the [Trivia QA](https://huggingface.co/datasets/trivia_qa) dataset, along with the terms to search for
in Wikipedia that might help in answering the question.
The annotation was done using the self-instruct format. A small subset of annotations was done by humans (the author) and used as k-shot examples to feed into the
Gemini-Pro model to annotate the rest of the dataset.
## Dataset Details
### Dataset Description
- **Curated by:** [Mohit Raghavendra]
## Uses
This can be used to fine-tune an agent, given a question, to find the term to search for in Wikipedia.
## Dataset Creation
### Source Data
TriviaQA dataset - [https://huggingface.co/datasets/trivia_qa](https://huggingface.co/datasets/trivia_qa)
#### Data Collection and Processing
The data is a subsample of the TriviaQA dataset, specifically the first **1%** of the training split from the TriviaQA dataset.
```python
import datasets

datasets.load_dataset("trivia_qa", "rc.nocontext", split="train[:1%]")
```
### Annotations [optional]
The first 30 examples in the dataset are annotated by the author.
These are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.
The following system message was used to instruct the model, followed by examples:
```python
SYSTEM_MESSAGE = f"""There exists a wikipedia summarizer that can return a summary for a topic. \
Your job is to act as an aid to a question answering tool. Whenever you are asked about a question related to general knowledge, \
instead of using your internal knowledge (which can be faulty or out of date), \
format a Wikipedia search query string that can help answer the question. \
Wikipedia Entries are usually about a simple entity or event, so keep the \
query short, and about the entity being asked about. Also, don't use your knowledge \
to ask about the answer. Instead form queries about the entity in the question. This \
will help you get the right wikipedia entries for questions when you dont know the answer
"""
```
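A rough sketch of the annotation loop described above, assuming the `google-generativeai` client and the `SYSTEM_MESSAGE` defined in the block above; the prompt assembly, the placeholder API key, and the example k-shot pair are assumptions rather than the author's exact pipeline.
```python
import datasets
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-pro")

trivia = datasets.load_dataset("trivia_qa", "rc.nocontext", split="train[:1%]")

# 10 hand-labelled (question, search term) pairs drawn from the first 30 examples
k_shot = [("<example question>", "<hand-written search term>")]  # ...9 more pairs

def build_prompt(question):
    shots = "\n\n".join(f"Question: {q}\nSearch term: {t}" for q, t in k_shot)
    return f"{SYSTEM_MESSAGE}\n\n{shots}\n\nQuestion: {question}\nSearch term:"

# Annotate one of the remaining questions
response = model.generate_content(build_prompt(trivia[30]["question"]))
print(response.text.strip())
```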
## Dataset Card Authors [optional]
Mohit Raghavendra
## Dataset Card Contact
[https://www.linkedin.com/in/mohit-r/](https://www.linkedin.com/in/mohit-r/)
| mohit-raghavendra/self-instruct-wikipedia | [
"license:apache-2.0",
"region:us"
] | 2023-12-24T04:53:14+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "query_terms", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 116267, "num_examples": 1384}], "download_size": 82027, "dataset_size": 116267}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-24T05:18:20+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# Dataset Card for Dataset Name
A Human+LLM annotated dataset of Wikipedia search terms. It is a collection of questions from the Trivia QA dataset, along with the terms to search for
in Wikipedia, that might help in answer the question.
The annotation was done by using the self-instruct format. A small subset of annotations were done by humans (the author) and were used as k-shot examples to feed into the
Gemini-Pro model to annotate the rest of the dataset.
## Dataset Details
### Dataset Description
- Curated by: [Mohit Raghavendra]
## Uses
This can be used to fine-tune an agent, given a question, to find the term to search for in Wikipedia.
## Dataset Creation
### Source Data
TriviaQA dataset - URL
#### Data Collection and Processing
The data is a subsample of the TriviaQA dataset, specifically the first 1% of the training split from the TriviaQA dataset.
### Annotations [optional]
The first 30 examples in the dataset are annotated by the author.
These are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.
The following system message was used to instruct the model, followed by examples:
## Dataset Card Authors [optional]
Mohit Raghavendra
## Dataset Card Contact
URL
| [
"# Dataset Card for Dataset Name\n\nA Human+LLM annotated dataset of Wikipedia search terms. It is a collection of questions from the Trivia QA dataset, along with the terms to search for \nin Wikipedia, that might help in answer the question. \n\nThe annotation was done by using the self-instruct format. A small subset of annotations were done by humans (the author) and were used as k-shot examples to feed into the \nGemini-Pro model to annotate the rest of the dataset.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: [Mohit Raghavendra]",
"## Uses\n\nThis can be used to fine-tune an agent, given a question, to find the term to search for in Wikipedia.",
"## Dataset Creation",
"### Source Data\n\nTriviaQA dataset - URL",
"#### Data Collection and Processing\n\nThe data is a subsample of the TriviaQA dataset, specifically the first 1% of the training split from the TriviaQA dataset.",
"### Annotations [optional]\n\nThe first 30 examples in the dataset are annotated by the author. \n\nThese are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.\n\nThe following system message was used to instruct the model, followed by examples:",
"## Dataset Card Authors [optional]\n\nMohit Raghavendra",
"## Dataset Card Contact\n\nURL"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Dataset Card for Dataset Name\n\nA Human+LLM annotated dataset of Wikipedia search terms. It is a collection of questions from the Trivia QA dataset, along with the terms to search for \nin Wikipedia, that might help in answer the question. \n\nThe annotation was done by using the self-instruct format. A small subset of annotations were done by humans (the author) and were used as k-shot examples to feed into the \nGemini-Pro model to annotate the rest of the dataset.",
"## Dataset Details",
"### Dataset Description\n\n- Curated by: [Mohit Raghavendra]",
"## Uses\n\nThis can be used to fine-tune an agent, given a question, to find the term to search for in Wikipedia.",
"## Dataset Creation",
"### Source Data\n\nTriviaQA dataset - URL",
"#### Data Collection and Processing\n\nThe data is a subsample of the TriviaQA dataset, specifically the first 1% of the training split from the TriviaQA dataset.",
"### Annotations [optional]\n\nThe first 30 examples in the dataset are annotated by the author. \n\nThese are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.\n\nThe following system message was used to instruct the model, followed by examples:",
"## Dataset Card Authors [optional]\n\nMohit Raghavendra",
"## Dataset Card Contact\n\nURL"
] | [
14,
114,
4,
17,
28,
5,
11,
38,
75,
15,
6
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Dataset Card for Dataset Name\n\nA Human+LLM annotated dataset of Wikipedia search terms. It is a collection of questions from the Trivia QA dataset, along with the terms to search for \nin Wikipedia, that might help in answer the question. \n\nThe annotation was done by using the self-instruct format. A small subset of annotations were done by humans (the author) and were used as k-shot examples to feed into the \nGemini-Pro model to annotate the rest of the dataset.## Dataset Details### Dataset Description\n\n- Curated by: [Mohit Raghavendra]## Uses\n\nThis can be used to fine-tune an agent, given a question, to find the term to search for in Wikipedia.## Dataset Creation### Source Data\n\nTriviaQA dataset - URL#### Data Collection and Processing\n\nThe data is a subsample of the TriviaQA dataset, specifically the first 1% of the training split from the TriviaQA dataset.### Annotations [optional]\n\nThe first 30 examples in the dataset are annotated by the author. \n\nThese are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.\n\nThe following system message was used to instruct the model, followed by examples:## Dataset Card Authors [optional]\n\nMohit Raghavendra## Dataset Card Contact\n\nURL"
] |
3340c6a089eea8c6c7f3546ce047600cc2feb22f |
**Code-74k-ShareGPT-Vicuna**
This dataset is in Vicuna/ShareGPT format. There are around 74,000 sets of conversations, each set containing 2 conversations.
Code in Python, Java, JavaScript, Go, C++, Rust, etc. is provided with detailed explanations.
This dataset has around 60~65% of Python code.
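As a rough illustration of what a single record in this format looks like (field names follow the common ShareGPT convention; the exact schema of this dataset may differ slightly):
```python
import json

example = {
    "id": "sample-0",  # hypothetical id
    "conversations": [
        {"from": "human", "value": "Write a Python function that reverses a string."},
        {"from": "gpt", "value": "def reverse(s):\n    return s[::-1]\n\nSlicing with step -1 walks the string backwards..."},
    ],
}
print(json.dumps(example, indent=2))
```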
| cognitivecomputations/Code-74k-ShareGPT-Vicuna | [
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"code",
"region:us"
] | 2023-12-24T05:26:50+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "tags": ["code"]} | 2023-12-24T05:31:57+00:00 | [] | [
"en"
] | TAGS
#size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us
|
Code-74k-ShareGPT-Vicuna
This dataset is in Vicuna/ShareGPT format. There are around 74000 set of conversations. Each set having 2 conversations.
Python, Java, JavaScript, GO, C++, Rust etc. code with detailed explanation are provided.
This dataset has around 60~65% of Python code.
| [] | [
"TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us \n"
] | [
32
] | [
"passage: TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us \n"
] |
71ab57deb642c730d04d0d0a75585d18ef4f1d8c |
# Alpaca GPT4 Hindi
This dataset is a Hindi-translated, filtered version of [alpaca-gpt4](https://huggingface.co/datasets/vicgalle/alpaca-gpt4), produced using [IndicTrans2](https://github.com/AI4Bharat/IndicTrans2). | BhabhaAI/alpaca-gpt4-hindi-trans | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:hi",
"license:cc-by-nc-4.0",
"region:us"
] | 2023-12-24T05:30:20+00:00 | {"language": ["hi"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"], "pretty_name": "Alpaca GPT4 Hindi"} | 2023-12-30T04:51:18+00:00 | [] | [
"hi"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-Hindi #license-cc-by-nc-4.0 #region-us
|
# Alpaca GPT4 Hindi
This dataset is hindi translated filtered version of alpaca-gpt4 using IndicTrans2. | [
"# Alpaca GPT4 Hindi\n\nThis dataset is hindi translated filtered version of alpaca-gpt4 using IndicTrans2."
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Hindi #license-cc-by-nc-4.0 #region-us \n",
"# Alpaca GPT4 Hindi\n\nThis dataset is hindi translated filtered version of alpaca-gpt4 using IndicTrans2."
] | [
43,
31
] | [
"passage: TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Hindi #license-cc-by-nc-4.0 #region-us \n# Alpaca GPT4 Hindi\n\nThis dataset is hindi translated filtered version of alpaca-gpt4 using IndicTrans2."
] |
a920e2242afeea39557fc7185997e850adb634eb |
# AniSpeech Dataset
Welcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.
- As we label more and more audio, new clips will automagically be uploaded here for use, separated by language
---
## ANNOUNCMENTS:
- An upcoming update will add an immense amount of data to the dataset. However, because we cannot manually go through this dataset, we have had to rely on quality estimation; as such, speaker splits may be inaccurate. This shouldn't impact finetuning multispeaker models, but when training single-speaker models you may have to listen to multiple speakers to find missing data. We plan on eventually completely overhauling this dataset
## Key Features
- **LJSpeech Format Compatibility:** The captions in this dataset can be converted to comply with the LJSpeech format (recent changes have sacrificed native LJSpeech support for better captions), and we plan to offer conversion scripts to said format eventually; a minimal conversion sketch is shown after this list.
- **Diverse Anime Voices:** Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.
- **Ideal for Generalized Models:** AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a separate speaker id).
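A minimal conversion sketch (referenced in the first feature above) might look like the following; the dataset id and split name are taken from this repo's metadata, while the `ljspeech/` output layout and writing the raw caption into both transcription columns are assumptions.
```python
import os
import soundfile as sf
from datasets import load_dataset

ds = load_dataset("ShoukanLabs/AniSpeech", split="ENGLISH")
os.makedirs("ljspeech/wavs", exist_ok=True)

with open("ljspeech/metadata.csv", "w", encoding="utf-8") as meta:
    for i, row in enumerate(ds):
        file_id = f"anispeech_{i:06d}"
        audio = row["audio"]
        sf.write(f"ljspeech/wavs/{file_id}.wav", audio["array"], audio["sampling_rate"])
        # LJSpeech rows are: id|transcription|normalized transcription
        meta.write(f"{file_id}|{row['caption']}|{row['caption']}\n")
```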
## Limitations
- **Single-Voice Fine-Tuning:** While AniSpeech excels in training foundation models (due to its diversity), it's not recommended for fine-tuning on a single voice. Its strength lies in contributing to the development of versatile TTS models.
- **Dataset Curation:** Due to its size, manually curating the entire dataset can be impractical. If you encounter low-quality files or incorrect captions, we encourage you to contribute by creating a pull request to help maintain and improve the dataset.
## License
This dataset is released under the [MIT License](https://huggingface.co/datasets/ShoukanLabs/AniSpeech/raw/main/license).
Your contributions to the AniSpeech dataset are invaluable, and we appreciate your efforts in advancing the field of Text-to-Speech technology.
Happy coding and synthesizing!
| ShoukanLabs/AniSpeech | [
"task_categories:text-to-speech",
"size_categories:n<1K",
"language:en",
"license:mit",
"anime",
"speech",
"text-to-speech",
"voice",
"region:us"
] | 2023-12-24T06:49:56+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-speech"], "pretty_name": "AniSpeech", "tags": ["anime", "speech", "text-to-speech", "voice"], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "caption", "dtype": "string"}, {"name": "phonetic captions", "dtype": "string"}, {"name": "voice", "dtype": "string"}], "splits": [{"name": "ENGLISH", "num_bytes": 18875728249.368, "num_examples": 23656}], "download_size": 20449215803, "dataset_size": 18875728249.368}, "configs": [{"config_name": "default", "data_files": [{"split": "ENGLISH", "path": "data/ENGLISH-*"}]}]} | 2024-01-29T04:53:57+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-to-speech #size_categories-n<1K #language-English #license-mit #anime #speech #text-to-speech #voice #region-us
|
# AniSpeech Dataset
Welcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.
- As we label more and more audio, they'll automagically be uploaded here for use, seperated by language
---
## ANNOUNCMENTS:
- An upcoming update will add an immense ammount of data to the dataset... however... because we cannot manually go through this dataset we have had to rely on manual quality estimation, as such, speaker splits may be innacurate, this shouldnt impact finetuning multispeaker models, but when training single speaker models you may have to listen to multiple speakers to find missing data, we plan on eventually completely overhauling this dataset eventually
## Key Features
- LJSpeech Format Compatibility: The captions in this dataset can be converted to (recent changes have sacrificed native LJSpeech support for better captions) comply with the LJSpeech format, and we plan to offer conversion scripts to said format eventually.
- Diverse Anime Voices: Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.
- Ideal for Generalized Models: AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a seperate speaker id).
## Limitations
- Single-Voice Fine-Tuning: While AniSpeech excels in training foundation models (due to it's diversity), it's not recommended for fine-tuning on a single voice. Its strength lies in contributing to the development of versatile TTS models.
- Dataset Curation: Due to its size, manually curating the entire dataset can be impractical. If you encounter low-quality files or incorrect captions, we encourage you to contribute by creating a pull request to help maintain and improve the dataset.
## License
This dataset is released under the MIT License.
Your contributions to the AniSpeech dataset are invaluable, and we appreciate your efforts in advancing the field of Text-to-Speech technology.
Happy coding and synthesizing!
| [
"# AniSpeech Dataset\n\nWelcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.\n- As we label more and more audio, they'll automagically be uploaded here for use, seperated by language\n\n---",
"## ANNOUNCMENTS:\n- An upcoming update will add an immense ammount of data to the dataset... however... because we cannot manually go through this dataset we have had to rely on manual quality estimation, as such, speaker splits may be innacurate, this shouldnt impact finetuning multispeaker models, but when training single speaker models you may have to listen to multiple speakers to find missing data, we plan on eventually completely overhauling this dataset eventually",
"## Key Features\n\n- LJSpeech Format Compatibility: The captions in this dataset can be converted to (recent changes have sacrificed native LJSpeech support for better captions) comply with the LJSpeech format, and we plan to offer conversion scripts to said format eventually.\n\n- Diverse Anime Voices: Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.\n\n- Ideal for Generalized Models: AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a seperate speaker id).",
"## Limitations\n\n- Single-Voice Fine-Tuning: While AniSpeech excels in training foundation models (due to it's diversity), it's not recommended for fine-tuning on a single voice. Its strength lies in contributing to the development of versatile TTS models.\n\n- Dataset Curation: Due to its size, manually curating the entire dataset can be impractical. If you encounter low-quality files or incorrect captions, we encourage you to contribute by creating a pull request to help maintain and improve the dataset.",
"## License\n\nThis dataset is released under the MIT License.\n\nYour contributions to the AniSpeech dataset are invaluable, and we appreciate your efforts in advancing the field of Text-to-Speech technology.\n\nHappy coding and synthesizing!"
] | [
"TAGS\n#task_categories-text-to-speech #size_categories-n<1K #language-English #license-mit #anime #speech #text-to-speech #voice #region-us \n",
"# AniSpeech Dataset\n\nWelcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.\n- As we label more and more audio, they'll automagically be uploaded here for use, seperated by language\n\n---",
"## ANNOUNCMENTS:\n- An upcoming update will add an immense ammount of data to the dataset... however... because we cannot manually go through this dataset we have had to rely on manual quality estimation, as such, speaker splits may be innacurate, this shouldnt impact finetuning multispeaker models, but when training single speaker models you may have to listen to multiple speakers to find missing data, we plan on eventually completely overhauling this dataset eventually",
"## Key Features\n\n- LJSpeech Format Compatibility: The captions in this dataset can be converted to (recent changes have sacrificed native LJSpeech support for better captions) comply with the LJSpeech format, and we plan to offer conversion scripts to said format eventually.\n\n- Diverse Anime Voices: Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.\n\n- Ideal for Generalized Models: AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a seperate speaker id).",
"## Limitations\n\n- Single-Voice Fine-Tuning: While AniSpeech excels in training foundation models (due to it's diversity), it's not recommended for fine-tuning on a single voice. Its strength lies in contributing to the development of versatile TTS models.\n\n- Dataset Curation: Due to its size, manually curating the entire dataset can be impractical. If you encounter low-quality files or incorrect captions, we encourage you to contribute by creating a pull request to help maintain and improve the dataset.",
"## License\n\nThis dataset is released under the MIT License.\n\nYour contributions to the AniSpeech dataset are invaluable, and we appreciate your efforts in advancing the field of Text-to-Speech technology.\n\nHappy coding and synthesizing!"
] | [
54,
66,
109,
185,
124,
57
] | [
"passage: TAGS\n#task_categories-text-to-speech #size_categories-n<1K #language-English #license-mit #anime #speech #text-to-speech #voice #region-us \n# AniSpeech Dataset\n\nWelcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.\n- As we label more and more audio, they'll automagically be uploaded here for use, seperated by language\n\n---## ANNOUNCMENTS:\n- An upcoming update will add an immense ammount of data to the dataset... however... because we cannot manually go through this dataset we have had to rely on manual quality estimation, as such, speaker splits may be innacurate, this shouldnt impact finetuning multispeaker models, but when training single speaker models you may have to listen to multiple speakers to find missing data, we plan on eventually completely overhauling this dataset eventually## Key Features\n\n- LJSpeech Format Compatibility: The captions in this dataset can be converted to (recent changes have sacrificed native LJSpeech support for better captions) comply with the LJSpeech format, and we plan to offer conversion scripts to said format eventually.\n\n- Diverse Anime Voices: Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.\n\n- Ideal for Generalized Models: AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a seperate speaker id)."
] |
02b9a506be119b558338f907cc5f81ef39672200 |
# IMDA National Speech Corpus (NSC) Speech-to-Text
Originally from https://www.imda.gov.sg/how-we-can-help/national-speech-corpus; this repository is simply a mirror. This dataset is associated with the Singapore Open Data Licence, https://www.sla.gov.sg/newsroom/statistics/singapore-open-data-licence
We uploaded the mp3 files compressed with 7z; extract a part with:
```bash
7za x part1-mp3.7z.001
```
All notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text/imda
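A rough sketch of iterating the extracted audio directly (see the note at the bottom of this card); the per-part folder layout and the 16 kHz resampling are assumptions.
```python
import glob
import librosa

for path in sorted(glob.glob("part1/**/*.mp3", recursive=True)):
    audio, sr = librosa.load(path, sr=16000)  # decode and resample one clip at a time
    print(path, round(len(audio) / sr, 2), "seconds")
```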
## total lengths
1. part 1, 1117.9866586978865 hours
2. part 2, 1052.1777120312486 hours
3. part 3, 2162.4968734548597 hours
4. part 4, 2133.685638097089 hours
5. part 5, 2044.5220318402826 hours
6. part 6, 2148.2834793402703 hours
## Why no HuggingFace dataset format?
We had bad experiences with the HuggingFace dataset format when loading huge datasets. Reading mp3 files during iteration is much faster and more efficient. | mesolitica/IMDA-STT | [
"language:en",
"region:us"
] | 2023-12-24T07:18:28+00:00 | {"language": ["en"], "pretty_name": "imda-"} | 2023-12-28T06:55:01+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
|
# IMDA National Speech Corpus (NSC) Speech-to-Text
Originally from URL this repository simply a mirror. This dataset associated with Singapore Open Data Licence, URL
We uploaded mp3 files and compressed using 7z,
All notebooks at URL
## total lengths
1. part 1, 1117.9866586978865 hours
2. part 2, 1052.1777120312486 hours
3. part 3, 2162.4968734548597 hours
4. part 4, 2133.685638097089 hours
5. part 5, 2044.5220318402826 hours
6. part 6, 2148.2834793402703 hours
## Why no HuggingFace dataset format?
We had bad experiences with HuggingFace dataset format to load huge dataset. Reading mp3 files during iteration is much more faster and efficient. | [
"# IMDA National Speech Corpus (NSC) Speech-to-Text\n\nOriginally from URL this repository simply a mirror. This dataset associated with Singapore Open Data Licence, URL\n\nWe uploaded mp3 files and compressed using 7z,\n\n\n\nAll notebooks at URL",
"## total lengths\n\n1. part 1, 1117.9866586978865 hours\n2. part 2, 1052.1777120312486 hours\n3. part 3, 2162.4968734548597 hours\n4. part 4, 2133.685638097089 hours\n5. part 5, 2044.5220318402826 hours\n6. part 6, 2148.2834793402703 hours",
"## Why no HuggingFace dataset format?\n\nWe had bad experiences with HuggingFace dataset format to load huge dataset. Reading mp3 files during iteration is much more faster and efficient."
] | [
"TAGS\n#language-English #region-us \n",
"# IMDA National Speech Corpus (NSC) Speech-to-Text\n\nOriginally from URL this repository simply a mirror. This dataset associated with Singapore Open Data Licence, URL\n\nWe uploaded mp3 files and compressed using 7z,\n\n\n\nAll notebooks at URL",
"## total lengths\n\n1. part 1, 1117.9866586978865 hours\n2. part 2, 1052.1777120312486 hours\n3. part 3, 2162.4968734548597 hours\n4. part 4, 2133.685638097089 hours\n5. part 5, 2044.5220318402826 hours\n6. part 6, 2148.2834793402703 hours",
"## Why no HuggingFace dataset format?\n\nWe had bad experiences with HuggingFace dataset format to load huge dataset. Reading mp3 files during iteration is much more faster and efficient."
] | [
10,
58,
79,
45
] | [
"passage: TAGS\n#language-English #region-us \n# IMDA National Speech Corpus (NSC) Speech-to-Text\n\nOriginally from URL this repository simply a mirror. This dataset associated with Singapore Open Data Licence, URL\n\nWe uploaded mp3 files and compressed using 7z,\n\n\n\nAll notebooks at URL## total lengths\n\n1. part 1, 1117.9866586978865 hours\n2. part 2, 1052.1777120312486 hours\n3. part 3, 2162.4968734548597 hours\n4. part 4, 2133.685638097089 hours\n5. part 5, 2044.5220318402826 hours\n6. part 6, 2148.2834793402703 hours## Why no HuggingFace dataset format?\n\nWe had bad experiences with HuggingFace dataset format to load huge dataset. Reading mp3 files during iteration is much more faster and efficient."
] |
2b70fff5216d1ff94fb6bb24b61df8f731cfd870 |
Translated from English to Hindi using Google Translation API.
Transliterated from Hindi to Hinglish using [libindic/indic-trans](https://github.com/libindic/indic-trans). | rishiraj/hinglish | [
"region:us"
] | 2023-12-24T07:20:48+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 31866063, "num_examples": 9500}, {"name": "test", "num_bytes": 1712744, "num_examples": 500}], "download_size": 19747792, "dataset_size": 33578807}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-24T18:28:51+00:00 | [] | [] | TAGS
#region-us
|
Translated from English to Hindi using Google Translation API.
Transliterated from Hindi to Hinglish using libindic/indic-trans. | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
a37a946825014e5eecbc35a820708361f26c0215 | Reference materials for training clothing, for reference in the Waifuc project.
I. Explanation
The "1_origin" folder contains the normal files.
The "4_faceless" folder contains the files with the head removed.
II. Actual results:
SD1.5:https://civitai.com/models/183962?modelVersionId=206475
XL:https://civitai.com/models/234715/xl-mayhem-maid-clothes-hutten
训练服装的参考素材,给waifuc项目参考
一,说明
1_origin文件夹里面是正常的文件。
4_faceless文件夹里面是去掉头部的文件
二、实际效果:
SD1.5:https://civitai.com/models/183962?modelVersionId=206475
XL:https://civitai.com/models/234715/xl-mayhem-maid-clothes-hutten | windsingai/headout_example | [
"region:us"
] | 2023-12-24T07:32:17+00:00 | {} | 2023-12-24T07:41:37+00:00 | [] | [] | TAGS
#region-us
| Reference materials for training clothing, for reference in the Waifuc project.
I. Explanation
The "1_origin" folder contains the normal files.
The "4_faceless" folder contains the files with the head removed.
II. Actual results:
SD1.5:URL
XL:URL
训练服装的参考素材,给waifuc项目参考
一,说明
1_origin文件夹里面是正常的文件。
4_faceless文件夹里面是去掉头部的文件
二、实际效果:
SD1.5:URL
XL:URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
6f0b721311a33736b19434e276f04727d5c2e213 | # Dataset Card for "WritingPrompts_preferences"
Human preference data from r/WritingPrompts | euclaise/WritingPrompts_preferences | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"region:us"
] | 2023-12-24T09:30:07+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "WritingPrompts Preferences", "dataset_info": {"features": [{"name": "post_text", "dtype": "string"}, {"name": "post_title", "dtype": "string"}, {"name": "post_scores", "dtype": "int64"}, {"name": "comment_texts", "sequence": "string"}, {"name": "comment_scores", "sequence": "int64"}, {"name": "comment_times", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 2340246558, "num_examples": 265174}], "download_size": 1357734208, "dataset_size": 2340246558}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-25T13:57:46+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us
| # Dataset Card for "WritingPrompts_preferences"
Human preference data from r/WritingPrompts | [
"# Dataset Card for \"WritingPrompts_preferences\"\n\nHuman preference data from r/WritingPrompts"
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us \n",
"# Dataset Card for \"WritingPrompts_preferences\"\n\nHuman preference data from r/WritingPrompts"
] | [
38,
29
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-mit #region-us \n# Dataset Card for \"WritingPrompts_preferences\"\n\nHuman preference data from r/WritingPrompts"
] |
fb0cd374e3920985664638da0549bd4354560d08 |
# Flan-v2 filtered
This dataset is a much-reduced version of the original Flan-v2:
- only CoT-task few-shot samples are included, in both the opt and noopt variants
- samples with fewer than 100 tokens in the response have been removed (see the filtering sketch below)
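A minimal sketch of the kind of response-length filter described above, assuming a Hugging Face `datasets` workflow; the tokenizer choice is an assumption for illustration, since the card does not state which tokenizer was used for the original cut-off.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Assumed tokenizer purely for illustration; the tokenizer used for the original filtering is unspecified.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

ds = load_dataset("Roudranil/Flan-v2-Filtered", split="train")

def response_is_long_enough(example):
    # Keep only samples whose response ("targets") is at least 100 tokens long.
    return len(tokenizer(example["targets"]).input_ids) >= 100

filtered = ds.filter(response_is_long_enough)
print(f"{len(ds)} -> {len(filtered)} samples after re-applying the length check")
```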
| Roudranil/Flan-v2-Filtered | [
"region:us"
] | 2023-12-24T09:35:01+00:00 | {"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20917402.4513315, "num_examples": 12635}], "download_size": 17409381, "dataset_size": 20917402.4513315}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-24T09:37:19+00:00 | [] | [] | TAGS
#region-us
|
# Flan-v2 filtered
This dataset is a version of the original Flanv2 but much reduced
- only CoT task few-shot samples are here, both opt and noopt
- samples with <100 tokens in response have been removed
| [
"# Flan-v2 filtered\n\nThis dataset is a version of the original Flanv2 but much reduced\n- only CoT task few-shot samples are here, both opt and noopt\n- samples with <100 tokens in response have been removed"
] | [
"TAGS\n#region-us \n",
"# Flan-v2 filtered\n\nThis dataset is a version of the original Flanv2 but much reduced\n- only CoT task few-shot samples are here, both opt and noopt\n- samples with <100 tokens in response have been removed"
] | [
6,
57
] | [
"passage: TAGS\n#region-us \n# Flan-v2 filtered\n\nThis dataset is a version of the original Flanv2 but much reduced\n- only CoT task few-shot samples are here, both opt and noopt\n- samples with <100 tokens in response have been removed"
] |
179dd21fc55192153d94adb0e0ce8f69e222bf75 |
# Open Assistant Conversations Dataset Release 2 (OASST2)
## Dataset Description
- **Homepage:** https://www.open-assistant.io/
- **Repository:** https://github.com/LAION-AI/Open-Assistant
- **Paper:** https://arxiv.org/abs/2304.07327
### Dataset Structure
This dataset contains message trees. Each message tree has an initial prompt message as the root node,
which can have multiple child messages as replies, and these child messages can have multiple replies.
All messages have a role property: this can either be "assistant" or "prompter". The roles in
conversation threads from prompt to leaf node strictly alternate between "prompter" and "assistant".
This version of the dataset contains data collected on the [open-assistant.io](https://open-assistant.io/) website until Nov 5 2023.
### JSON Example: Message
For readability, the following JSON examples are shown formatted with indentation on multiple lines.
Objects are stored without indentation (on single lines) in the actual jsonl files.
```json
{
"message_id": "218440fd-5317-4355-91dc-d001416df62b",
"parent_id": "13592dfb-a6f9-4748-a92c-32b34e239bb4",
"user_id": "8e95461f-5e94-4d8b-a2fb-d4717ce973e4",
"text": "It was the winter of 2035, and artificial intelligence (..)",
"role": "assistant",
"lang": "en",
"review_count": 3,
"review_result": true,
"deleted": false,
"rank": 0,
"synthetic": true,
"model_name": "oasst-sft-0_3000,max_new_tokens=400 (..)",
"labels": {
"spam": { "value": 0.0, "count": 3 },
"lang_mismatch": { "value": 0.0, "count": 3 },
"pii": { "value": 0.0, "count": 3 },
"not_appropriate": { "value": 0.0, "count": 3 },
"hate_speech": { "value": 0.0, "count": 3 },
"sexual_content": { "value": 0.0, "count": 3 },
"quality": { "value": 0.416, "count": 3 },
"toxicity": { "value": 0.16, "count": 3 },
"humor": { "value": 0.0, "count": 3 },
"creativity": { "value": 0.33, "count": 3 },
"violence": { "value": 0.16, "count": 3 }
}
}
```
### JSON Example: Conversation Tree
For readability, only a subset of the message properties is shown here.
```json
{
"message_tree_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"tree_state": "ready_for_export",
"prompt": {
"message_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"text": "Why can't we divide by 0? (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "894d30b6-56b4-4605-a504-89dd15d4d1c8",
"text": "The reason we cannot divide by zero is because (..)",
"role": "assistant",
"lang": "en",
"replies": [
// ...
]
},
{
"message_id": "84d0913b-0fd9-4508-8ef5-205626a7039d",
"text": "The reason that the result of a division by zero is (..)",
"role": "assistant",
"lang": "en",
"replies": [
{
"message_id": "3352725e-f424-4e3b-a627-b6db831bdbaa",
"text": "Math is confusing. Like those weird Irrational (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "f46207ca-3149-46e9-a466-9163d4ce499c",
"text": "Irrational numbers are simply numbers (..)",
"role": "assistant",
"lang": "en",
"replies": []
},
// ...
]
}
]
}
]
}
}
```
Please refer to [oasst-data](https://github.com/LAION-AI/Open-Assistant/tree/main/oasst-data) for
details about the data structure and Python code to read and write jsonl files containing oasst data objects.
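For a quick look at the raw exports without the official `oasst-data` helpers, a minimal sketch using only the standard library is enough; the file name is one of the gzipped exports listed in the next section, and the field access follows the JSON examples above.

```python
import gzip
import json

# Path to one of the downloaded exports (see "Main Dataset Files" below).
path = "2023-11-05_oasst2_ready.trees.jsonl.gz"

trees = []
with gzip.open(path, "rt", encoding="utf-8") as f:
    for line in f:
        trees.append(json.loads(line))

print(len(trees), "conversation trees loaded")
# Each record is a full tree; the root prompt text sits under prompt -> text.
print(trees[0]["prompt"]["text"][:80])
```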
## Main Dataset Files
Conversation data is provided either as nested messages in trees (extension `.trees.jsonl.gz`)
or as a flat list (table) of messages (extension `.messages.jsonl.gz`).
### Ready For Export Trees
```
2023-11-05_oasst2_ready.trees.jsonl.gz 13,854 trees with 135,174 total messages
2023-11-05_oasst2_ready.messages.jsonl.gz 135,174 messages
```
#### 2023-11-05_oasst2_ready.trees.jsonl.gz Stats
```
Trees : 13,854
Messages : 135,174
Oldest message : 2023-01-16 20:24:26.211711+00:00
Youngest message : 2023-11-04 15:23:03.239343+00:00
Detoxify ratings : 111,448
Accepted messages: 129,517
Deleted messages : 4,376
Tree counts by state:
- ready_for_export: 13,854
Message counts by language:
- en: 64,513
- es: 28,199
- ru: 13,935
- zh: 8,615
- de: 6,145
- fr: 3,880
- pt-BR: 2,699
- th: 1,560
- ca: 1,283
- it: 943
- uk-UA: 845
- ja: 788
- pl: 435
- eo: 295
- eu: 274
- vi: 207
- fi: 138
- hu: 113
- ar: 80
- nl: 72
- da: 44
- tr: 37
- ko: 24
- he: 24
- id: 12
- cs: 12
- bn: 1
- sv: 1
```
Trees in the ready_for_export state, without spam and deleted messages, including message labels. The oasst_ready-trees file is usually sufficient for supervised fine-tuning (SFT) & reward model (RM) training.
### All Trees
```
2023-11-05_oasst2_all.trees.jsonl.gz 70,642 trees with 208,584 total messages
2023-11-05_oasst2_all.messages.jsonl.gz 208,584 messages
```
All trees, including those in states prompt_lottery_waiting (trees that consist of only one message, namely the initial prompt), aborted_low_grade (trees that stopped growing because the messages had low quality), and halted_by_moderator.
#### 2023-11-05_oasst2_all.trees.jsonl.gz Stats
```
Trees : 70,642
Messages : 208,584
Oldest message : 2023-01-16 20:24:26.211711+00:00
Youngest message : 2023-11-05 10:24:44.484910+00:00
Detoxify ratings : 156,570
Accepted messages: 189,288
Deleted messages : 5,414
Tree counts by state:
- ready_for_export: 13,854
- prompt_lottery_waiting: 44,550
- halted_by_moderator: 3,089
- initial_prompt_review: 4,319
- growing: 3,102
- aborted_low_grade: 1,708
- ranking: 20
Message counts by language:
- en: 85,115
- es: 47,513
- ru: 15,990
- zh: 11,205
- de: 8,398
- fr: 5,841
- pt-BR: 4,540
- th: 3,236
- ca: 2,586
- it: 2,144
- ja: 1,904
- uk-UA: 1,889
- ko: 1,635
- pl: 1,510
- eo: 1,405
- nl: 1,354
- ar: 1,274
- vi: 1,137
- fi: 1,098
- eu: 995
- hu: 961
- tr: 803
- sv: 763
- id: 669
- gl: 574
- da: 502
- he: 498
- cs: 476
- ro: 434
- sk: 410
- fa: 394
- el: 388
- bar: 217
- nb-NO: 196
- bg: 176
- bn: 128
- sl: 119
- sr: 63
- swg: 23
- hi: 14
- lt: 7
```
### Supplemental Exports: Spam & Prompts
```
2023-11-05_oasst2_spam.messages.jsonl.gz 19,296 matching messages
```
These are messages which were deleted or have a negative review result ("review_result": false). Besides low quality, a frequent reason for message deletion is a wrong language tag.
```
2023-11-05_oasst2_prompts.messages.jsonl.gz 64,592 matching messages
```
These are all the kept initial prompt messages with positive review result (no spam) of trees in `ready_for_export` or `prompt_lottery_waiting` state.
### Using the Huggingface Datasets
While HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.
Nevertheless, we make all messages which can also be found in the file `2023-11-05_oasst2_ready.messages.jsonl.gz` available in parquet format as train/validation splits.
These are directly loadable by [Huggingface Datasets](https://pypi.org/project/datasets/).
To load the oasst2 train & validation splits use:
```python
from datasets import load_dataset
ds = load_dataset("OpenAssistant/oasst2")
train = ds['train'] # len(train)=128575 (95%)
val = ds['validation'] # len(val)=6599 (5%)
```
The messages appear in depth-first order of the message trees.
Full conversation trees can be reconstructed from the flat messages table by using the `parent_id`
and `message_id` properties to identify the parent-child relationship of messages. The `message_tree_id`
and `tree_state` properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.
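As a minimal sketch of that reconstruction (using only the documented `message_id`/`parent_id` fields; root prompts are assumed to carry a null `parent_id` in the parquet export):

```python
from collections import defaultdict
from datasets import load_dataset

train = load_dataset("OpenAssistant/oasst2", split="train")

# Group messages by their parent so each tree can be walked from its root prompt.
children = defaultdict(list)
roots = []
for m in train:
    if m["parent_id"] is None:
        roots.append(m)  # initial prompts (tree roots)
    else:
        children[m["parent_id"]].append(m)

def print_thread(msg, depth=0):
    # Depth-first walk of one conversation tree.
    print("  " * depth + f"{msg['role']}: {msg['text'][:60]!r}")
    for child in children[msg["message_id"]]:
        print_thread(child, depth + 1)

print_thread(roots[0])
```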
### Data Visualisation
Explore the content of the prompts from the English subset using [Bunka](https://github.com/charlesdedampierre/BunkaTopics) open-source visualization technology.
The interactive map [available on a HF space](https://huggingface.co/spaces/bunkalab/visualisation-oasst2) allows to explore each datapoint to get a more precise overview of the contents.
<a href="https://i.imgur.com/B2H8LR3.png">
<img src="https://i.imgur.com/B2H8LR3.png" alt="Bunka oasst2 Map" width="35%"/>
</a>
## Contact
- Discord [Open Assistant Discord Server](https://ykilcher.com/open-assistant-discord)
- GitHub: [LAION-AI/Open-Assistant](https://github.com/LAION-AI/Open-Assistant)
- E-Mail: [[email protected]](mailto:[email protected])
| OpenAssistant/oasst2 | [
"size_categories:100K<n<1M",
"language:en",
"language:es",
"language:ru",
"language:de",
"language:pl",
"language:th",
"language:vi",
"language:sv",
"language:bn",
"language:da",
"language:he",
"language:it",
"language:fa",
"language:sk",
"language:id",
"language:nb",
"language:el",
"language:nl",
"language:hu",
"language:eu",
"language:zh",
"language:eo",
"language:ja",
"language:ca",
"language:cs",
"language:bg",
"language:fi",
"language:pt",
"language:tr",
"language:ro",
"language:ar",
"language:uk",
"language:gl",
"language:fr",
"language:ko",
"license:apache-2.0",
"human-feedback",
"arxiv:2304.07327",
"region:us"
] | 2023-12-24T09:53:24+00:00 | {"language": ["en", "es", "ru", "de", "pl", "th", "vi", "sv", "bn", "da", "he", "it", "fa", "sk", "id", "nb", "el", "nl", "hu", "eu", "zh", "eo", "ja", "ca", "cs", "bg", "fi", "pt", "tr", "ro", "ar", "uk", "gl", "fr", "ko"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "pretty_name": "OpenAssistant Conversations Release 2", "dataset_info": {"features": [{"name": "message_id", "dtype": "string"}, {"name": "parent_id", "dtype": "string"}, {"name": "user_id", "dtype": "string"}, {"name": "created_date", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "role", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "review_count", "dtype": "int32"}, {"name": "review_result", "dtype": "bool"}, {"name": "deleted", "dtype": "bool"}, {"name": "rank", "dtype": "int32"}, {"name": "synthetic", "dtype": "bool"}, {"name": "model_name", "dtype": "string"}, {"name": "detoxify", "struct": [{"name": "toxicity", "dtype": "float64"}, {"name": "severe_toxicity", "dtype": "float64"}, {"name": "obscene", "dtype": "float64"}, {"name": "identity_attack", "dtype": "float64"}, {"name": "insult", "dtype": "float64"}, {"name": "threat", "dtype": "float64"}, {"name": "sexual_explicit", "dtype": "float64"}]}, {"name": "message_tree_id", "dtype": "string"}, {"name": "tree_state", "dtype": "string"}, {"name": "emojis", "sequence": [{"name": "name", "dtype": "string"}, {"name": "count", "dtype": "int32"}]}, {"name": "labels", "sequence": [{"name": "name", "dtype": "string"}, {"name": "value", "dtype": "float64"}, {"name": "count", "dtype": "int32"}]}], "splits": [{"name": "train", "num_bytes": 158850455, "num_examples": 128575}, {"name": "validation", "num_bytes": 7963122, "num_examples": 6599}], "download_size": 66674129, "dataset_size": 166813577}, "tags": ["human-feedback"]} | 2024-01-11T06:09:29+00:00 | [
"2304.07327"
] | [
"en",
"es",
"ru",
"de",
"pl",
"th",
"vi",
"sv",
"bn",
"da",
"he",
"it",
"fa",
"sk",
"id",
"nb",
"el",
"nl",
"hu",
"eu",
"zh",
"eo",
"ja",
"ca",
"cs",
"bg",
"fi",
"pt",
"tr",
"ro",
"ar",
"uk",
"gl",
"fr",
"ko"
] | TAGS
#size_categories-100K<n<1M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #license-apache-2.0 #human-feedback #arxiv-2304.07327 #region-us
|
# Open Assistant Conversations Dataset Release 2 (OASST2)
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
### Dataset Structure
This dataset contains message trees. Each message tree has an initial prompt message as the root node,
which can have multiple child messages as replies, and these child messages can have multiple replies.
All messages have a role property: this can either be "assistant" or "prompter". The roles in
conversation threads from prompt to leaf node strictly alternate between "prompter" and "assistant".
This version of the dataset contains data collected on the URL website until Nov 5 2023.
### JSON Example: Message
For readability, the following JSON examples are shown formatted with indentation on multiple lines.
Objects are stored without indentation (on single lines) in the actual jsonl files.
### JSON Example: Conversation Tree
For readability, only a subset of the message properties is shown here.
Please refer to oasst-data for
details about the data structure and Python code to read and write jsonl files containing oasst data objects.
## Main Dataset Files
Conversation data is provided either as nested messages in trees (extension '.URL')
or as a flat list (table) of messages (extension '.URL').
### Ready For Export Trees
#### 2023-11-05_oasst2_ready.URL Stats
Trees in ready_for_export state without spam and deleted messages including message labels. The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.
### All Trees
All trees, including those in states prompt_lottery_waiting (trees that consist of only one message, namely the initial prompt), aborted_low_grade (trees that stopped growing because the messages had low quality), and halted_by_moderator.
#### 2023-11-05_oasst2_all.URL Stats
### Supplemental Exports: Spam & Prompts
These are messages which were deleted or have a negative review result ("review_result": false). Besides low quality, a frequent reason for message deletion is a wrong language tag.
These are all the kept initial prompt messages with positive review result (no spam) of trees in 'ready_for_export' or 'prompt_lottery_waiting' state.
### Using the Huggingface Datasets
While HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.
Nevertheless, we make all messages which can also be found in the file '2023-11-05_oasst2_ready.URL' available in parquet format as train/validation splits.
These are directly loadable by Huggingface Datasets.
To load the oasst2 train & validation splits use:
The messages appear in depth-first order of the message trees.
Full conversation trees can be reconstructed from the flat messages table by using the 'parent_id'
and 'message_id' properties to identify the parent-child relationship of messages. The 'message_tree_id'
and 'tree_state' properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.
### Data Visualisation
Explore the content of the prompts from the English subset using Bunka open-source visualization technology.
The interactive map available on a HF space allows to explore each datapoint to get a more precise overview of the contents.
<a href="https://i.URL
<img src="https://i.URL alt="Bunka oasst2 Map" width="35%"/>
</a>
## Contact
- Discord Open Assistant Discord Server
- GitHub: LAION-AI/Open-Assistant
- E-Mail: open-assistant@URL
| [
"# Open Assistant Conversations Dataset Release 2 (OASST2)",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL",
"### Dataset Structure\n\nThis dataset contains message trees. Each message tree has an initial prompt message as the root node, \nwhich can have multiple child messages as replies, and these child messages can have multiple replies. \n\nAll messages have a role property: this can either be \"assistant\" or \"prompter\". The roles in \nconversation threads from prompt to leaf node strictly alternate between \"prompter\" and \"assistant\".\n\nThis version of the dataset contains data collected on the URL website until Nov 5 2023.",
"### JSON Example: Message\n\nFor readability, the following JSON examples are shown formatted with indentation on multiple lines.\nObjects are stored without indentation (on single lines) in the actual jsonl files.",
"### JSON Example: Conversation Tree\n\nFor readability, only a subset of the message properties is shown here.\n\n\n\nPlease refer to oasst-data for\ndetails about the data structure and Python code to read and write jsonl files containing oasst data objects.",
"## Main Dataset Files\n\nConversation data is provided either as nested messages in trees (extension '.URL') \nor as a flat list (table) of messages (extension '.URL').",
"### Ready For Export Trees",
"#### 2023-11-05_oasst2_ready.URL Stats\n\n\nTrees in ready_for_export state without spam and deleted messages including message labels. The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.",
"### All Trees\n\n\n\nAll trees, including those in states prompt_lottery_waiting (trees that consist of only one message, namely the initial prompt), aborted_low_grade (trees that stopped growing because the messages had low quality), and halted_by_moderator.",
"#### 2023-11-05_oasst2_all.URL Stats",
"### Supplemental Exports: Spam & Prompts \n\n\n\nThese are messages which were deleted or have a negative review result (\"review_result\": false). Besides low quality, a frequent reason for message deletion is a wrong language tag.\n\n\n\nThese are all the kept initial prompt messages with positive review result (no spam) of trees in 'ready_for_export' or 'prompt_lottery_waiting' state.",
"### Using the Huggingface Datasets\n\nWhile HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.\nNevertheless, we make all messages which can also be found in the file '2023-11-05_oasst2_ready.URL' available in parquet format as train/validation splits. \nThese are directly loadable by Huggingface Datasets.\n\nTo load the oasst2 train & validation splits use:\n\n\n\nThe messages appear in depth-first order of the message trees.\n\nFull conversation trees can be reconstructed from the flat messages table by using the 'parent_id' \nand 'message_id' properties to identify the parent-child relationship of messages. The 'message_tree_id' \nand 'tree_state' properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.",
"### Data Visualisation\n\nExplore the content of the prompts from the English subset using Bunka open-source visualization technology. \nThe interactive map available on a HF space allows to explore each datapoint to get a more precise overview of the contents.\n\n<a href=\"https://i.URL\n <img src=\"https://i.URL alt=\"Bunka oasst2 Map\" width=\"35%\"/>\n</a>",
"## Contact\n\n- Discord Open Assistant Discord Server\n- GitHub: LAION-AI/Open-Assistant\n- E-Mail: open-assistant@URL"
] | [
"TAGS\n#size_categories-100K<n<1M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #license-apache-2.0 #human-feedback #arxiv-2304.07327 #region-us \n",
"# Open Assistant Conversations Dataset Release 2 (OASST2)",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL",
"### Dataset Structure\n\nThis dataset contains message trees. Each message tree has an initial prompt message as the root node, \nwhich can have multiple child messages as replies, and these child messages can have multiple replies. \n\nAll messages have a role property: this can either be \"assistant\" or \"prompter\". The roles in \nconversation threads from prompt to leaf node strictly alternate between \"prompter\" and \"assistant\".\n\nThis version of the dataset contains data collected on the URL website until Nov 5 2023.",
"### JSON Example: Message\n\nFor readability, the following JSON examples are shown formatted with indentation on multiple lines.\nObjects are stored without indentation (on single lines) in the actual jsonl files.",
"### JSON Example: Conversation Tree\n\nFor readability, only a subset of the message properties is shown here.\n\n\n\nPlease refer to oasst-data for\ndetails about the data structure and Python code to read and write jsonl files containing oasst data objects.",
"## Main Dataset Files\n\nConversation data is provided either as nested messages in trees (extension '.URL') \nor as a flat list (table) of messages (extension '.URL').",
"### Ready For Export Trees",
"#### 2023-11-05_oasst2_ready.URL Stats\n\n\nTrees in ready_for_export state without spam and deleted messages including message labels. The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.",
"### All Trees\n\n\n\nAll trees, including those in states prompt_lottery_waiting (trees that consist of only one message, namely the initial prompt), aborted_low_grade (trees that stopped growing because the messages had low quality), and halted_by_moderator.",
"#### 2023-11-05_oasst2_all.URL Stats",
"### Supplemental Exports: Spam & Prompts \n\n\n\nThese are messages which were deleted or have a negative review result (\"review_result\": false). Besides low quality, a frequent reason for message deletion is a wrong language tag.\n\n\n\nThese are all the kept initial prompt messages with positive review result (no spam) of trees in 'ready_for_export' or 'prompt_lottery_waiting' state.",
"### Using the Huggingface Datasets\n\nWhile HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.\nNevertheless, we make all messages which can also be found in the file '2023-11-05_oasst2_ready.URL' available in parquet format as train/validation splits. \nThese are directly loadable by Huggingface Datasets.\n\nTo load the oasst2 train & validation splits use:\n\n\n\nThe messages appear in depth-first order of the message trees.\n\nFull conversation trees can be reconstructed from the flat messages table by using the 'parent_id' \nand 'message_id' properties to identify the parent-child relationship of messages. The 'message_tree_id' \nand 'tree_state' properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.",
"### Data Visualisation\n\nExplore the content of the prompts from the English subset using Bunka open-source visualization technology. \nThe interactive map available on a HF space allows to explore each datapoint to get a more precise overview of the contents.\n\n<a href=\"https://i.URL\n <img src=\"https://i.URL alt=\"Bunka oasst2 Map\" width=\"35%\"/>\n</a>",
"## Contact\n\n- Discord Open Assistant Discord Server\n- GitHub: LAION-AI/Open-Assistant\n- E-Mail: open-assistant@URL"
] | [
235,
15,
18,
120,
51,
61,
46,
8,
72,
65,
16,
96,
223,
98,
36
] | [
"passage: TAGS\n#size_categories-100K<n<1M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #license-apache-2.0 #human-feedback #arxiv-2304.07327 #region-us \n# Open Assistant Conversations Dataset Release 2 (OASST2)## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL### Dataset Structure\n\nThis dataset contains message trees. Each message tree has an initial prompt message as the root node, \nwhich can have multiple child messages as replies, and these child messages can have multiple replies. \n\nAll messages have a role property: this can either be \"assistant\" or \"prompter\". The roles in \nconversation threads from prompt to leaf node strictly alternate between \"prompter\" and \"assistant\".\n\nThis version of the dataset contains data collected on the URL website until Nov 5 2023.### JSON Example: Message\n\nFor readability, the following JSON examples are shown formatted with indentation on multiple lines.\nObjects are stored without indentation (on single lines) in the actual jsonl files.### JSON Example: Conversation Tree\n\nFor readability, only a subset of the message properties is shown here.\n\n\n\nPlease refer to oasst-data for\ndetails about the data structure and Python code to read and write jsonl files containing oasst data objects.",
"passage: ## Main Dataset Files\n\nConversation data is provided either as nested messages in trees (extension '.URL') \nor as a flat list (table) of messages (extension '.URL').### Ready For Export Trees#### 2023-11-05_oasst2_ready.URL Stats\n\n\nTrees in ready_for_export state without spam and deleted messages including message labels. The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.### All Trees\n\n\n\nAll trees, including those in states prompt_lottery_waiting (trees that consist of only one message, namely the initial prompt), aborted_low_grade (trees that stopped growing because the messages had low quality), and halted_by_moderator.#### 2023-11-05_oasst2_all.URL Stats### Supplemental Exports: Spam & Prompts \n\n\n\nThese are messages which were deleted or have a negative review result (\"review_result\": false). Besides low quality, a frequent reason for message deletion is a wrong language tag.\n\n\n\nThese are all the kept initial prompt messages with positive review result (no spam) of trees in 'ready_for_export' or 'prompt_lottery_waiting' state.### Using the Huggingface Datasets\n\nWhile HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.\nNevertheless, we make all messages which can also be found in the file '2023-11-05_oasst2_ready.URL' available in parquet format as train/validation splits. \nThese are directly loadable by Huggingface Datasets.\n\nTo load the oasst2 train & validation splits use:\n\n\n\nThe messages appear in depth-first order of the message trees.\n\nFull conversation trees can be reconstructed from the flat messages table by using the 'parent_id' \nand 'message_id' properties to identify the parent-child relationship of messages. The 'message_tree_id' \nand 'tree_state' properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state."
] |
70e08063c5402bd237fb8f11ac7eaa68dd53e755 |
This dataset contains German exonyms for various places in modern-day Poland, the Czech Republic, Latvia, Lithuania and Estonia.
Exonym: a placename used by people who are not locals. For example, Prague is the English exonym of the Czech capital Praha, and Cologne is an exonym for the German city Köln.
Due to extensive historical German rule and presence over large parts of modern-day Poland and the Czech Republic, these two countries populate the dataset the most. | DebasishDhal99/German-Names-Middle-And-Eastern-Europe | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:de",
"language:pl",
"language:cs",
"language:lt",
"language:lv",
"language:et",
"language:sl",
"language:sk",
"license:mit",
"region:us"
] | 2023-12-24T09:55:59+00:00 | {"language": ["de", "pl", "cs", "lt", "lv", "et", "sl", "sk"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["translation"], "configs": [{"config_name": "polish_german", "data_files": [{"split": "train", "path": "german_polish.csv"}]}, {"config_name": "czech_german", "data_files": [{"split": "train", "path": "german_czech.csv"}]}, {"config_name": "lithuanian_german", "data_files": [{"split": "train", "path": "german_lithuanian.csv"}]}, {"config_name": "latvian_german", "data_files": [{"split": "train", "path": "german_latvian.csv"}]}, {"config_name": "estonian_german", "data_files": [{"split": "train", "path": "german_estonian.csv"}]}, {"config_name": "slovak_german", "data_files": [{"split": "train", "path": "german_slovak.csv"}]}, {"config_name": "slovene_german", "data_files": [{"split": "train", "path": "german_slovene.csv"}]}]} | 2024-01-03T11:09:38+00:00 | [] | [
"de",
"pl",
"cs",
"lt",
"lv",
"et",
"sl",
"sk"
] | TAGS
#task_categories-translation #size_categories-10K<n<100K #language-German #language-Polish #language-Czech #language-Lithuanian #language-Latvian #language-Estonian #language-Slovenian #language-Slovak #license-mit #region-us
|
This dataset contains German exonyms for various places in modern-day Poland, the Czech Republic, Latvia, Lithuania and Estonia.
Exonym: a placename used by people who are not locals. For example, Prague is the English exonym of the Czech capital Praha, and Cologne is an exonym for the German city Köln.
Due to extensive historical German rule and presence over large parts of modern-day Poland and the Czech Republic, these two countries populate the dataset the most. | [] | [
"TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-German #language-Polish #language-Czech #language-Lithuanian #language-Latvian #language-Estonian #language-Slovenian #language-Slovak #license-mit #region-us \n"
] | [
76
] | [
"passage: TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-German #language-Polish #language-Czech #language-Lithuanian #language-Latvian #language-Estonian #language-Slovenian #language-Slovak #license-mit #region-us \n"
] |
afd122458feb9f466f050d165a55ec8002ed41a7 |
# Introduction
This dataset represents a curated collection of parallel Arabic-English texts, featuring the translations of 24 historically and culturally significant books. These texts provide a portal to the intellectual and literary heritage of the Arabic-speaking world during its classical period.
# Content Details
Contained within this dataset are English translations of the following texts, sourced from the [Rasaif website](https://rasaif.com/):
- A Muslim Manual of War
- Al-Hanin Ila'l-Awtan
- Avarice and the Avaricious
- Contemplation
- Diseases of the Hearts and Their Cures
- Hayy ibn Yaqzan
- Ibn Khallikan's Biographical Dictionary
- Kitab al-I'tibar
- Knowledge Mandates Action
- Morals and Behaviour
- Nahj al-Balagha
- The Book of Strangers
- The Canon Of Medicine of Avicenna
- The Epistle on Legal Theory
- The Heavenly Dispute
- The Islamic Conquest of Syria
- The Journey of the Strangers
- The Key to Medicine and a Guide for Students
- The Muqaddimah: An Introduction to History
- The Optics of Ibn Al-Haytham
- The Rare and Excellent History of Saladin
- The Ring of the Dove
- The Strangers
- The Travels Of Ibn Battuta, 1325 – 1354
# Purpose and Application
The overarching objective of this dataset is to highlight the superior literary quality of Classical Arabic, which stands in stark contrast to the language's later developments, particularly due to the mass translations of European texts in the 19th and 20th centuries. It aims to:
- Refine Machine Translation (MT): With its intricate grammatical structure and rich lexicon, Classical Arabic presents an ideal challenge for MT systems, which, when honed on such high-caliber content, can achieve greater accuracy and fluency.
- Language Models: By incorporating texts of such linguistic finesse, this dataset becomes a cornerstone for developing Large Language Models (LLMs) that can grasp and replicate the sophistication inherent in Classical Arabic.
- Preserve Linguistic Heritage: This dataset acts as a conduit for preserving the exceptional literary form of Classical Arabic, providing a benchmark of quality against which contemporary writings can be measured.
# Suggested Research Application: Iterative Translation Refinement
A novel application for this dataset involves utilizing existing translation models to back-translate the English texts into Arabic, likely resulting in a less sophisticated form of the language. This process, known as back-translation, can generate a large corpus of imperfect Arabic text. Subsequently, a new model could be trained to refine this weaker form of Arabic by comparing it to the original Classical Arabic texts in the dataset. The resultant model can be used to enhance current Arabic texts by making it sound more "Classical".
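A rough, hedged sketch of that idea is shown below — not a prescribed recipe. The MT model name (Helsinki-NLP/opus-mt-en-ar) and the sample pair are illustrative assumptions; in practice the English side would come from this dataset's parallel texts and the Classical Arabic side would serve as the refinement target.

```python
from transformers import pipeline

# Assumed off-the-shelf English->Arabic model; any comparable MT system could be substituted.
en_to_ar = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ar")

# Illustrative placeholder pair; real pairs come from the dataset's parallel columns.
pairs = [
    ("<Classical Arabic sentence>", "The sage said that knowledge mandates action."),
]

refinement_examples = []
for classical_ar, english in pairs:
    weak_ar = en_to_ar(english)[0]["translation_text"]  # back-translated, "weaker" Arabic
    # A refinement model would then be trained to map weak_ar -> classical_ar.
    refinement_examples.append({"source": weak_ar, "target": classical_ar})

print(refinement_examples[0])
```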
# Credits
[The Rasaif Website](https://rasaif.com/): For updates and more information about their work, follow them on [Twitter](https://twitter.com/rasaif_com), and follow Ahmad Alghamdi's [Telegram channel](https://t.me/ahmedhassg)
| ImruQays/Rasaif-Classical-Arabic-English-Parallel-texts | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:ar",
"language:en",
"region:us"
] | 2023-12-24T11:21:17+00:00 | {"language": ["ar", "en"], "size_categories": ["10K<n<100K"], "task_categories": ["translation"]} | 2023-12-28T15:15:40+00:00 | [] | [
"ar",
"en"
] | TAGS
#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us
|
# Introduction
This dataset represents a curated collection of parallel Arabic-English texts, featuring the translations of 24 historically and culturally significant books. These texts provide a portal to the intellectual and literary heritage of the Arabic-speaking world during its classical period.
# Content Details
Contained within this dataset are English translations of the following texts, sourced from the Rasaif website:
- A Muslim Manual of War
- Al-Hanin Ila'l-Awtan
- Avarice and the Avaricious
- Contemplation
- Diseases of the Hearts and Their Cures
- Hayy ibn Yaqzan
- Ibn Khallikan's Biographical Dictionary
- Kitab al-I'tibar
- Knowledge Mandates Action
- Morals and Behaviour
- Nahj al-Balagha
- The Book of Strangers
- The Canon Of Medicine of Avicenna
- The Epistle on Legal Theory
- The Heavenly Dispute
- The Islamic Conquest of Syria
- The Journey of the Strangers
- The Key to Medicine and a Guide for Students
- The Muqaddimah: An Introduction to History
- The Optics of Ibn Al-Haytham
- The Rare and Excellent History of Saladin
- The Ring of the Dove
- The Strangers
- The Travels Of Ibn Battuta, 1325 – 1354
# Purpose and Application
The overarching objective of this dataset is to highlight the superior literary quality of Classical Arabic, which stands in stark contrast to the language's later developments, particularly due to the mass translations of European texts in the 19th and 20th centuries. It aims to:
- Refine Machine Translation (MT): With its intricate grammatical structure and rich lexicon, Classical Arabic presents an ideal challenge for MT systems, which, when honed on such high-caliber content, can achieve greater accuracy and fluency.
- Language Models: By incorporating texts of such linguistic finesse, this dataset becomes a cornerstone for developing Large Language Models (LLMs) that can grasp and replicate the sophistication inherent in Classical Arabic.
- Preserve Linguistic Heritage: This dataset acts as a conduit for preserving the exceptional literary form of Classical Arabic, providing a benchmark of quality against which contemporary writings can be measured.
# Suggested Research Application: Iterative Translation Refinement
A novel application for this dataset involves utilizing existing translation models to back-translate the English texts into Arabic, likely resulting in a less sophisticated form of the language. This process, known as back-translation, can generate a large corpus of imperfect Arabic text. Subsequently, a new model could be trained to refine this weaker form of Arabic by comparing it to the original Classical Arabic texts in the dataset. The resultant model can be used to enhance current Arabic texts by making it sound more "Classical".
# Credits
The Rasaif Website: For updates and more information about their work, follow them on Twitter, and follow Ahmad Alghamdi's Telegram channel
| [
"# Introduction\n\nThis dataset represents a curated collection of parallel Arabic-English texts, featuring the translations of 24 historically and culturally significant books. These texts provide a portal to the intellectual and literary heritage of the Arabic-speaking world during its classical period.",
"# Content Details\n\nContained within this dataset are English translations of the following texts, sourced from the Rasaif website:\n\n- A Muslim Manual of War\n- Al-Hanin Ila'l-Awtan\n- Avarice and the Avaricious\n- Contemplation\n- Diseases of the Hearts and Their Cures\n- Hayy ibn Yaqzan\n- Ibn Khallikan's Biographical Dictionary\n- Kitab al-I'tibar\n- Knowledge Mandates Action\n- Morals and Behaviour\n- Nahj al-Balagha\n- The Book of Strangers\n- The Canon Of Medicine of Avicenna\n- The Epistle on Legal Theory\n- The Heavenly Dispute\n- The Islamic Conquest of Syria\n- The Journey of the Strangers\n- The Key to Medicine and a Guide for Students\n- The Muqaddimah: An Introduction to History\n- The Optics of Ibn Al-Haytham\n- The Rare and Excellent History of Saladin\n- The Ring of the Dove\n- The Strangers\n- The Travels Of Ibn Battuta, 1325 – 1354",
"# Purpose and Application\n\nThe overarching objective of this dataset is to highlight the superior literary quality of Classical Arabic, which stands in stark contrast to the language's later developments, particularly due to the mass translations of European texts in the 19th and 20th centuries. It aims to:\n\n- Refine Machine Translation (MT): With its intricate grammatical structure and rich lexicon, Classical Arabic presents an ideal challenge for MT systems, which, when honed on such high-caliber content, can achieve greater accuracy and fluency.\n\n- Language Models: By incorporating texts of such linguistic finesse, this dataset becomes a cornerstone for developing Large Language Models (LLMs) that can grasp and replicate the sophistication inherent in Classical Arabic.\n\n- Preserve Linguistic Heritage: This dataset acts as a conduit for preserving the exceptional literary form of Classical Arabic, providing a benchmark of quality against which contemporary writings can be measured.",
"# Suggested Research Application: Iterative Translation Refinement\n\nA novel application for this dataset involves utilizing existing translation models to back-translate the English texts into Arabic, likely resulting in a less sophisticated form of the language. This process, known as back-translation, can generate a large corpus of imperfect Arabic text. Subsequently, a new model could be trained to refine this weaker form of Arabic by comparing it to the original Classical Arabic texts in the dataset. The resultant model can be used to enhance current Arabic texts by making it sound more \"Classical\".",
"# Credits\nThe Rasaif Website: For updates and more information about their work, follow them on Twitter, and follow Ahmad Alghamdi's Telegram channel"
] | [
"TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us \n",
"# Introduction\n\nThis dataset represents a curated collection of parallel Arabic-English texts, featuring the translations of 24 historically and culturally significant books. These texts provide a portal to the intellectual and literary heritage of the Arabic-speaking world during its classical period.",
"# Content Details\n\nContained within this dataset are English translations of the following texts, sourced from the Rasaif website:\n\n- A Muslim Manual of War\n- Al-Hanin Ila'l-Awtan\n- Avarice and the Avaricious\n- Contemplation\n- Diseases of the Hearts and Their Cures\n- Hayy ibn Yaqzan\n- Ibn Khallikan's Biographical Dictionary\n- Kitab al-I'tibar\n- Knowledge Mandates Action\n- Morals and Behaviour\n- Nahj al-Balagha\n- The Book of Strangers\n- The Canon Of Medicine of Avicenna\n- The Epistle on Legal Theory\n- The Heavenly Dispute\n- The Islamic Conquest of Syria\n- The Journey of the Strangers\n- The Key to Medicine and a Guide for Students\n- The Muqaddimah: An Introduction to History\n- The Optics of Ibn Al-Haytham\n- The Rare and Excellent History of Saladin\n- The Ring of the Dove\n- The Strangers\n- The Travels Of Ibn Battuta, 1325 – 1354",
"# Purpose and Application\n\nThe overarching objective of this dataset is to highlight the superior literary quality of Classical Arabic, which stands in stark contrast to the language's later developments, particularly due to the mass translations of European texts in the 19th and 20th centuries. It aims to:\n\n- Refine Machine Translation (MT): With its intricate grammatical structure and rich lexicon, Classical Arabic presents an ideal challenge for MT systems, which, when honed on such high-caliber content, can achieve greater accuracy and fluency.\n\n- Language Models: By incorporating texts of such linguistic finesse, this dataset becomes a cornerstone for developing Large Language Models (LLMs) that can grasp and replicate the sophistication inherent in Classical Arabic.\n\n- Preserve Linguistic Heritage: This dataset acts as a conduit for preserving the exceptional literary form of Classical Arabic, providing a benchmark of quality against which contemporary writings can be measured.",
"# Suggested Research Application: Iterative Translation Refinement\n\nA novel application for this dataset involves utilizing existing translation models to back-translate the English texts into Arabic, likely resulting in a less sophisticated form of the language. This process, known as back-translation, can generate a large corpus of imperfect Arabic text. Subsequently, a new model could be trained to refine this weaker form of Arabic by comparing it to the original Classical Arabic texts in the dataset. The resultant model can be used to enhance current Arabic texts by making it sound more \"Classical\".",
"# Credits\nThe Rasaif Website: For updates and more information about their work, follow them on Twitter, and follow Ahmad Alghamdi's Telegram channel"
] | [
36,
64,
229,
227,
137,
32
] | [
"passage: TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us \n# Introduction\n\nThis dataset represents a curated collection of parallel Arabic-English texts, featuring the translations of 24 historically and culturally significant books. These texts provide a portal to the intellectual and literary heritage of the Arabic-speaking world during its classical period.# Content Details\n\nContained within this dataset are English translations of the following texts, sourced from the Rasaif website:\n\n- A Muslim Manual of War\n- Al-Hanin Ila'l-Awtan\n- Avarice and the Avaricious\n- Contemplation\n- Diseases of the Hearts and Their Cures\n- Hayy ibn Yaqzan\n- Ibn Khallikan's Biographical Dictionary\n- Kitab al-I'tibar\n- Knowledge Mandates Action\n- Morals and Behaviour\n- Nahj al-Balagha\n- The Book of Strangers\n- The Canon Of Medicine of Avicenna\n- The Epistle on Legal Theory\n- The Heavenly Dispute\n- The Islamic Conquest of Syria\n- The Journey of the Strangers\n- The Key to Medicine and a Guide for Students\n- The Muqaddimah: An Introduction to History\n- The Optics of Ibn Al-Haytham\n- The Rare and Excellent History of Saladin\n- The Ring of the Dove\n- The Strangers\n- The Travels Of Ibn Battuta, 1325 – 1354"
] |
0278c6e093a5a8ba64aa87a28be9f1139f3b60f4 | # Dataset Card for "bbc_news_alltime"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | RealTimeData/bbc_news_alltime | [
"region:us"
] | 2023-12-24T11:32:33+00:00 | {"dataset_info": [{"config_name": "2017-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5574520, "num_examples": 1688}], "download_size": 0, "dataset_size": 5574520}, {"config_name": "2017-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5013358, "num_examples": 1469}], "download_size": 2533589, "dataset_size": 5013358}, {"config_name": "2017-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3454177, "num_examples": 721}], "download_size": 1456354, "dataset_size": 3454177}, {"config_name": "2017-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3759656, "num_examples": 807}], "download_size": 1573085, "dataset_size": 3759656}, {"config_name": "2017-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3656616, "num_examples": 756}], "download_size": 1577606, "dataset_size": 3656616}, {"config_name": "2017-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4546752, "num_examples": 1106}], "download_size": 2055760, "dataset_size": 4546752}, {"config_name": "2017-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4669023, "num_examples": 1139}], "download_size": 2220913, "dataset_size": 4669023}, {"config_name": "2017-08", 
"features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4529387, "num_examples": 1113}], "download_size": 2053558, "dataset_size": 4529387}, {"config_name": "2017-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4950651, "num_examples": 1199}], "download_size": 2406134, "dataset_size": 4950651}, {"config_name": "2017-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4900443, "num_examples": 1187}], "download_size": 2344203, "dataset_size": 4900443}, {"config_name": "2017-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5141607, "num_examples": 1443}], "download_size": 2535360, "dataset_size": 5141607}, {"config_name": "2017-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4273797, "num_examples": 1294}], "download_size": 2074041, "dataset_size": 4273797}, {"config_name": "2018-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4789841, "num_examples": 1323}], "download_size": 0, "dataset_size": 4789841}, {"config_name": "2018-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4174594, "num_examples": 1223}], "download_size": 1922883, "dataset_size": 4174594}, {"config_name": "2018-03", "features": [{"name": "title", "dtype": "string"}, {"name": 
"published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4550223, "num_examples": 1280}], "download_size": 2193369, "dataset_size": 4550223}, {"config_name": "2018-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4646713, "num_examples": 1328}], "download_size": 0, "dataset_size": 4646713}, {"config_name": "2018-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4549377, "num_examples": 1334}], "download_size": 0, "dataset_size": 4549377}, {"config_name": "2018-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4416735, "num_examples": 1189}], "download_size": 2050298, "dataset_size": 4416735}, {"config_name": "2018-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5677193, "num_examples": 1496}], "download_size": 0, "dataset_size": 5677193}, {"config_name": "2018-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4346176, "num_examples": 1253}], "download_size": 2051252, "dataset_size": 4346176}, {"config_name": "2018-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4299146, "num_examples": 1277}], "download_size": 2067971, "dataset_size": 4299146}, {"config_name": "2018-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": 
"description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4207852, "num_examples": 1249}], "download_size": 1992203, "dataset_size": 4207852}, {"config_name": "2018-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4390888, "num_examples": 1290}], "download_size": 2117715, "dataset_size": 4390888}, {"config_name": "2018-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3725672, "num_examples": 1138}], "download_size": 1703129, "dataset_size": 3725672}, {"config_name": "2019-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4299425, "num_examples": 1240}], "download_size": 2076680, "dataset_size": 4299425}, {"config_name": "2019-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4403481, "num_examples": 1214}], "download_size": 2138193, "dataset_size": 4403481}, {"config_name": "2019-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4758117, "num_examples": 1333}], "download_size": 2336195, "dataset_size": 4758117}, {"config_name": "2019-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4691658, "num_examples": 1280}], "download_size": 2280145, "dataset_size": 4691658}, {"config_name": "2019-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": 
"string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4809409, "num_examples": 1369}], "download_size": 2423627, "dataset_size": 4809409}, {"config_name": "2019-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4971344, "num_examples": 1348}], "download_size": 2439729, "dataset_size": 4971344}, {"config_name": "2019-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5114465, "num_examples": 1366}], "download_size": 2547598, "dataset_size": 5114465}, {"config_name": "2019-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4379278, "num_examples": 1219}], "download_size": 2080813, "dataset_size": 4379278}, {"config_name": "2019-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4784664, "num_examples": 1256}], "download_size": 2267891, "dataset_size": 4784664}, {"config_name": "2019-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4805548, "num_examples": 1271}], "download_size": 2314075, "dataset_size": 4805548}, {"config_name": "2019-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4665346, "num_examples": 1275}], "download_size": 2241667, "dataset_size": 4665346}, {"config_name": "2019-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": 
"link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4766654, "num_examples": 1304}], "download_size": 2240533, "dataset_size": 4766654}, {"config_name": "2020-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4693399, "num_examples": 1230}], "download_size": 2249724, "dataset_size": 4693399}, {"config_name": "2020-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4456312, "num_examples": 1197}], "download_size": 2111991, "dataset_size": 4456312}, {"config_name": "2020-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4188579, "num_examples": 1156}], "download_size": 1921306, "dataset_size": 4188579}, {"config_name": "2020-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4280469, "num_examples": 1152}], "download_size": 1864282, "dataset_size": 4280469}, {"config_name": "2020-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4709875, "num_examples": 1257}], "download_size": 2250585, "dataset_size": 4709875}, {"config_name": "2020-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4890877, "num_examples": 1231}], "download_size": 2339433, "dataset_size": 4890877}, {"config_name": "2020-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": 
"string"}], "splits": [{"name": "train", "num_bytes": 4895721, "num_examples": 1302}], "download_size": 2466602, "dataset_size": 4895721}, {"config_name": "2020-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4740067, "num_examples": 1240}], "download_size": 2301105, "dataset_size": 4740067}, {"config_name": "2020-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4609527, "num_examples": 1199}], "download_size": 2215523, "dataset_size": 4609527}, {"config_name": "2020-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5077617, "num_examples": 1298}], "download_size": 2468054, "dataset_size": 5077617}, {"config_name": "2020-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5140934, "num_examples": 1297}], "download_size": 2550717, "dataset_size": 5140934}, {"config_name": "2020-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4704766, "num_examples": 1186}], "download_size": 2228502, "dataset_size": 4704766}, {"config_name": "2021-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5788543, "num_examples": 1365}], "download_size": 2802958, "dataset_size": 5788543}, {"config_name": "2021-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5566915, 
"num_examples": 1368}], "download_size": 2782746, "dataset_size": 5566915}, {"config_name": "2021-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5442120, "num_examples": 1321}], "download_size": 2714031, "dataset_size": 5442120}, {"config_name": "2021-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5428458, "num_examples": 1320}], "download_size": 2608886, "dataset_size": 5428458}, {"config_name": "2021-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5459942, "num_examples": 1264}], "download_size": 2678492, "dataset_size": 5459942}, {"config_name": "2021-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5684472, "num_examples": 1367}], "download_size": 2845555, "dataset_size": 5684472}, {"config_name": "2021-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6015721, "num_examples": 1486}], "download_size": 0, "dataset_size": 6015721}, {"config_name": "2021-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5237163, "num_examples": 1381}], "download_size": 2520550, "dataset_size": 5237163}, {"config_name": "2021-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5787591, "num_examples": 1429}], "download_size": 2964644, "dataset_size": 
5787591}, {"config_name": "2021-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5951443, "num_examples": 1474}], "download_size": 0, "dataset_size": 5951443}, {"config_name": "2021-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6156073, "num_examples": 1461}], "download_size": 3072907, "dataset_size": 6156073}, {"config_name": "2021-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5669496, "num_examples": 1344}], "download_size": 2737609, "dataset_size": 5669496}, {"config_name": "2022-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5772649, "num_examples": 1404}], "download_size": 2775239, "dataset_size": 5772649}, {"config_name": "2022-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5978585, "num_examples": 1405}], "download_size": 2998444, "dataset_size": 5978585}, {"config_name": "2022-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6155116, "num_examples": 1440}], "download_size": 2846323, "dataset_size": 6155116}, {"config_name": "2022-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5990391, "num_examples": 1436}], "download_size": 2845665, "dataset_size": 5990391}, {"config_name": "2022-05", "features": [{"name": "title", "dtype": 
"string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5731497, "num_examples": 1357}], "download_size": 2771401, "dataset_size": 5731497}, {"config_name": "2022-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6193465, "num_examples": 1479}], "download_size": 3050919, "dataset_size": 6193465}, {"config_name": "2022-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5952295, "num_examples": 1445}], "download_size": 3005257, "dataset_size": 5952295}, {"config_name": "2022-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5202318, "num_examples": 1281}], "download_size": 2554877, "dataset_size": 5202318}, {"config_name": "2022-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6475630, "num_examples": 1538}], "download_size": 3116639, "dataset_size": 6475630}, {"config_name": "2022-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5720095, "num_examples": 1394}], "download_size": 2833046, "dataset_size": 5720095}, {"config_name": "2022-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6746726, "num_examples": 1630}], "download_size": 0, "dataset_size": 6746726}, {"config_name": "2022-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": 
"authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6503786, "num_examples": 1647}], "download_size": 3259667, "dataset_size": 6503786}, {"config_name": "2023-01", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6581264, "num_examples": 1623}], "download_size": 3294354, "dataset_size": 6581264}, {"config_name": "2023-02", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6833602, "num_examples": 1588}], "download_size": 3372795, "dataset_size": 6833602}, {"config_name": "2023-03", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6496844, "num_examples": 1590}], "download_size": 0, "dataset_size": 6496844}, {"config_name": "2023-04", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6929455, "num_examples": 1672}], "download_size": 3485685, "dataset_size": 6929455}, {"config_name": "2023-05", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7189370, "num_examples": 1746}], "download_size": 3613049, "dataset_size": 7189370}, {"config_name": "2023-06", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6890616, "num_examples": 1674}], "download_size": 3430482, "dataset_size": 6890616}, {"config_name": "2023-07", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "description", "dtype": "string"}, 
{"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6886749, "num_examples": 1694}], "download_size": 0, "dataset_size": 6886749}, {"config_name": "2023-08", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7000778, "num_examples": 1715}], "download_size": 3433271, "dataset_size": 7000778}, {"config_name": "2023-09", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6672924, "num_examples": 1661}], "download_size": 3377990, "dataset_size": 6672924}, {"config_name": "2023-10", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7057042, "num_examples": 1680}], "download_size": 3400238, "dataset_size": 7057042}, {"config_name": "2023-11", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6948193, "num_examples": 1575}], "download_size": 3263773, "dataset_size": 6948193}, {"config_name": "2023-12", "features": [{"name": "title", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "description", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "link", "dtype": "string"}, {"name": "top_image", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6295385, "num_examples": 1460}], "download_size": 3029041, "dataset_size": 6295385}], "configs": [{"config_name": "2017-01", "data_files": [{"split": "train", "path": "2017-01/train-*"}]}, {"config_name": "2017-02", "data_files": [{"split": "train", "path": "2017-02/train-*"}]}, {"config_name": "2017-03", "data_files": [{"split": "train", "path": "2017-03/train-*"}]}, {"config_name": "2017-04", "data_files": [{"split": "train", "path": "2017-04/train-*"}]}, {"config_name": "2017-05", "data_files": [{"split": "train", "path": "2017-05/train-*"}]}, {"config_name": "2017-06", "data_files": [{"split": "train", "path": "2017-06/train-*"}]}, {"config_name": "2017-07", "data_files": [{"split": "train", "path": "2017-07/train-*"}]}, {"config_name": "2017-08", "data_files": [{"split": "train", "path": "2017-08/train-*"}]}, {"config_name": "2017-09", 
"data_files": [{"split": "train", "path": "2017-09/train-*"}]}, {"config_name": "2017-10", "data_files": [{"split": "train", "path": "2017-10/train-*"}]}, {"config_name": "2017-11", "data_files": [{"split": "train", "path": "2017-11/train-*"}]}, {"config_name": "2017-12", "data_files": [{"split": "train", "path": "2017-12/train-*"}]}, {"config_name": "2018-01", "data_files": [{"split": "train", "path": "2018-01/train-*"}]}, {"config_name": "2018-02", "data_files": [{"split": "train", "path": "2018-02/train-*"}]}, {"config_name": "2018-03", "data_files": [{"split": "train", "path": "2018-03/train-*"}]}, {"config_name": "2018-04", "data_files": [{"split": "train", "path": "2018-04/train-*"}]}, {"config_name": "2018-05", "data_files": [{"split": "train", "path": "2018-05/train-*"}]}, {"config_name": "2018-06", "data_files": [{"split": "train", "path": "2018-06/train-*"}]}, {"config_name": "2018-07", "data_files": [{"split": "train", "path": "2018-07/train-*"}]}, {"config_name": "2018-08", "data_files": [{"split": "train", "path": "2018-08/train-*"}]}, {"config_name": "2018-09", "data_files": [{"split": "train", "path": "2018-09/train-*"}]}, {"config_name": "2018-10", "data_files": [{"split": "train", "path": "2018-10/train-*"}]}, {"config_name": "2018-11", "data_files": [{"split": "train", "path": "2018-11/train-*"}]}, {"config_name": "2018-12", "data_files": [{"split": "train", "path": "2018-12/train-*"}]}, {"config_name": "2019-01", "data_files": [{"split": "train", "path": "2019-01/train-*"}]}, {"config_name": "2019-02", "data_files": [{"split": "train", "path": "2019-02/train-*"}]}, {"config_name": "2019-03", "data_files": [{"split": "train", "path": "2019-03/train-*"}]}, {"config_name": "2019-04", "data_files": [{"split": "train", "path": "2019-04/train-*"}]}, {"config_name": "2019-05", "data_files": [{"split": "train", "path": "2019-05/train-*"}]}, {"config_name": "2019-06", "data_files": [{"split": "train", "path": "2019-06/train-*"}]}, {"config_name": "2019-07", "data_files": [{"split": "train", "path": "2019-07/train-*"}]}, {"config_name": "2019-08", "data_files": [{"split": "train", "path": "2019-08/train-*"}]}, {"config_name": "2019-09", "data_files": [{"split": "train", "path": "2019-09/train-*"}]}, {"config_name": "2019-10", "data_files": [{"split": "train", "path": "2019-10/train-*"}]}, {"config_name": "2019-11", "data_files": [{"split": "train", "path": "2019-11/train-*"}]}, {"config_name": "2019-12", "data_files": [{"split": "train", "path": "2019-12/train-*"}]}, {"config_name": "2020-01", "data_files": [{"split": "train", "path": "2020-01/train-*"}]}, {"config_name": "2020-02", "data_files": [{"split": "train", "path": "2020-02/train-*"}]}, {"config_name": "2020-03", "data_files": [{"split": "train", "path": "2020-03/train-*"}]}, {"config_name": "2020-04", "data_files": [{"split": "train", "path": "2020-04/train-*"}]}, {"config_name": "2020-05", "data_files": [{"split": "train", "path": "2020-05/train-*"}]}, {"config_name": "2020-06", "data_files": [{"split": "train", "path": "2020-06/train-*"}]}, {"config_name": "2020-07", "data_files": [{"split": "train", "path": "2020-07/train-*"}]}, {"config_name": "2020-08", "data_files": [{"split": "train", "path": "2020-08/train-*"}]}, {"config_name": "2020-09", "data_files": [{"split": "train", "path": "2020-09/train-*"}]}, {"config_name": "2020-10", "data_files": [{"split": "train", "path": "2020-10/train-*"}]}, {"config_name": "2020-11", "data_files": [{"split": "train", "path": "2020-11/train-*"}]}, {"config_name": "2020-12", 
"data_files": [{"split": "train", "path": "2020-12/train-*"}]}, {"config_name": "2021-01", "data_files": [{"split": "train", "path": "2021-01/train-*"}]}, {"config_name": "2021-02", "data_files": [{"split": "train", "path": "2021-02/train-*"}]}, {"config_name": "2021-03", "data_files": [{"split": "train", "path": "2021-03/train-*"}]}, {"config_name": "2021-04", "data_files": [{"split": "train", "path": "2021-04/train-*"}]}, {"config_name": "2021-05", "data_files": [{"split": "train", "path": "2021-05/train-*"}]}, {"config_name": "2021-06", "data_files": [{"split": "train", "path": "2021-06/train-*"}]}, {"config_name": "2021-07", "data_files": [{"split": "train", "path": "2021-07/train-*"}]}, {"config_name": "2021-08", "data_files": [{"split": "train", "path": "2021-08/train-*"}]}, {"config_name": "2021-09", "data_files": [{"split": "train", "path": "2021-09/train-*"}]}, {"config_name": "2021-10", "data_files": [{"split": "train", "path": "2021-10/train-*"}]}, {"config_name": "2021-11", "data_files": [{"split": "train", "path": "2021-11/train-*"}]}, {"config_name": "2021-12", "data_files": [{"split": "train", "path": "2021-12/train-*"}]}, {"config_name": "2022-01", "data_files": [{"split": "train", "path": "2022-01/train-*"}]}, {"config_name": "2022-02", "data_files": [{"split": "train", "path": "2022-02/train-*"}]}, {"config_name": "2022-03", "data_files": [{"split": "train", "path": "2022-03/train-*"}]}, {"config_name": "2022-04", "data_files": [{"split": "train", "path": "2022-04/train-*"}]}, {"config_name": "2022-05", "data_files": [{"split": "train", "path": "2022-05/train-*"}]}, {"config_name": "2022-06", "data_files": [{"split": "train", "path": "2022-06/train-*"}]}, {"config_name": "2022-07", "data_files": [{"split": "train", "path": "2022-07/train-*"}]}, {"config_name": "2022-08", "data_files": [{"split": "train", "path": "2022-08/train-*"}]}, {"config_name": "2022-09", "data_files": [{"split": "train", "path": "2022-09/train-*"}]}, {"config_name": "2022-10", "data_files": [{"split": "train", "path": "2022-10/train-*"}]}, {"config_name": "2022-11", "data_files": [{"split": "train", "path": "2022-11/train-*"}]}, {"config_name": "2022-12", "data_files": [{"split": "train", "path": "2022-12/train-*"}]}, {"config_name": "2023-01", "data_files": [{"split": "train", "path": "2023-01/train-*"}]}, {"config_name": "2023-02", "data_files": [{"split": "train", "path": "2023-02/train-*"}]}, {"config_name": "2023-03", "data_files": [{"split": "train", "path": "2023-03/train-*"}]}, {"config_name": "2023-04", "data_files": [{"split": "train", "path": "2023-04/train-*"}]}, {"config_name": "2023-05", "data_files": [{"split": "train", "path": "2023-05/train-*"}]}, {"config_name": "2023-06", "data_files": [{"split": "train", "path": "2023-06/train-*"}]}, {"config_name": "2023-07", "data_files": [{"split": "train", "path": "2023-07/train-*"}]}, {"config_name": "2023-08", "data_files": [{"split": "train", "path": "2023-08/train-*"}]}, {"config_name": "2023-09", "data_files": [{"split": "train", "path": "2023-09/train-*"}]}, {"config_name": "2023-10", "data_files": [{"split": "train", "path": "2023-10/train-*"}]}, {"config_name": "2023-11", "data_files": [{"split": "train", "path": "2023-11/train-*"}]}, {"config_name": "2023-12", "data_files": [{"split": "train", "path": "2023-12/train-*"}]}]} | 2023-12-26T20:58:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "bbc_news_alltime"
More Information needed | [
"# Dataset Card for \"bbc_news_alltime\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"bbc_news_alltime\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"bbc_news_alltime\"\n\nMore Information needed"
] |
769a58e0e8b7b09210ca9116189c4cdf393524d9 | # Dataset Card for "mkb_hi_bn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sudeshna84/mkb_hi_bn | [
"region:us"
] | 2023-12-24T12:50:31+00:00 | {"dataset_info": {"features": [{"name": "hi", "dtype": "string"}, {"name": "bn", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3600.5454545454545, "num_examples": 7}, {"name": "test", "num_bytes": 2057.4545454545455, "num_examples": 4}], "download_size": 11344, "dataset_size": 5658.0}} | 2023-12-24T12:57:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mkb_hi_bn"
More Information needed | [
"# Dataset Card for \"mkb_hi_bn\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mkb_hi_bn\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mkb_hi_bn\"\n\nMore Information needed"
] |
e25c91bc31859606507a968559ab1de0f472d007 |
[Under Construction]
This is a repository containing all the queries from the Japanese part of the MMarco dataset, the multilingual version of the MSMarco dataset.
For each query, there are matching hard negatives:
- 25 of them retrieved by the multilingual e5 base model.
- Up to 10 of them retrieved by the basic implementation of BM25 for Japanese in the Anserini library. | bclavie/mmarco-japanese-hard-negatives | [
"task_categories:text-retrieval",
"language:ja",
"region:us"
] | 2023-12-24T13:04:26+00:00 | {"language": ["ja"], "task_categories": ["text-retrieval"], "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "positives", "sequence": "string"}, {"name": "negatives", "sequence": "string"}, {"name": "bm25_negatives", "sequence": "string"}, {"name": "original_negatives", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 24494938913, "num_examples": 391061}], "download_size": 11664534369, "dataset_size": 24494938913}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-24T18:52:04+00:00 | [] | [
"ja"
] | TAGS
#task_categories-text-retrieval #language-Japanese #region-us
|
[Under Construction]
This is a repository containing all the queries from the Japanese part of the MMarco dataset, the multilingual version of the MSMarco dataset.
For each query, there are matching hard negatives:
- 25 of them retrieved by the multilingual e5 base model.
- Up to 10 of them retrieved by the basic implementation of BM25 for Japanese in the Anserini library. | [] | [
"TAGS\n#task_categories-text-retrieval #language-Japanese #region-us \n"
] | [
24
] | [
"passage: TAGS\n#task_categories-text-retrieval #language-Japanese #region-us \n"
] |
7db58997ba226f9a21fe5dcd5e86a409de9cd4c4 |
# Introduction
This dataset represents a comprehensive collection of parallel Arabic-English texts from the Thaqalayn Hadith Library, a premier source for exploring the classical hadith tradition of the Imāmī Shia Muslim school of thought. The library focuses on making primary historical sources accessible, serving as a bridge between past wisdom and contemporary study. The dataset features translations of significant classical Imāmī hadith texts, allowing for a deep dive into the linguistic and cultural heritage of this era.
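For readers who want to work with the corpus programmatically, a minimal loading sketch using the Hugging Face `datasets` library is shown below. The repository id comes from this card; the split name (`train`) is an assumption, so inspect the returned object to confirm the actual splits and column names.

```python
from datasets import load_dataset

# Repo id is taken from this card; the "train" split name is an assumption --
# print the dataset to confirm the real splits and column names.
ds = load_dataset("ImruQays/Thaqalayn-Classical-Arabic-English-Parallel-texts", split="train")

print(ds)            # shows the number of rows and the column names
for row in ds.select(range(3)):
    print(row)       # a few Arabic-English parallel rows
```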
# Content Details
The Thaqalayn Hadith Library includes Arabic-English parallel texts from the following classical collections:
- Al-Kāfi (The Sufficient)
- Muʿjam al-Aḥādīth al-Muʿtabara (A Comprehensive Compilation of Reliable Narrations)
- Al-Khiṣāl (The Book of Characteristics)
- ʿUyūn akhbār al-Riḍā (The Source of Traditions on Imam al-Riḍā)
- Al-Amālī (The Dictations) by Shaykh Muḥammad b. Muḥammad al-Mufīd
- Al-Amālī (The Dictations) by Shaykh Muḥammad b. ʿAlī al-Ṣaduq
- Al-Tawḥīd (The Book of Divine Unity)
- Kitāb al-Ḍuʿafāʾ (The Weakened Ones)
- Kitāb al-Ghayba (The Book of Occultation) by Abū ʿAbd Allah Muḥammad b. Ibrāhīm al-Nuʿmānī
- Kitāb al-Ghayba (The Book of Occultation) by Shaykh Muḥammad b. al-Ḥasan al-Ṭūsī
- Thawāb al-Aʿmāl wa ʿiqāb al-Aʿmāl (The Rewards & Punishments of Deeds)
- Kāmil al-Ziyārāt (The Complete Pilgrimage Guide)
- Faḍaʾil al-Shīʿa (Virtues of the Shīʿa)
- Ṣifāt al-Shīʿa (Attributes of the Shīʿa)
- Maʿānī al-ʾAkhbār (The Meanings of Reports)
- Kitāb al-Muʾmin (The Book of the Believer)
- Kitāb al-Zuhd (The Book of Asceticism)
- Nahj al-Balāgha (The Peak of Eloquence)
# Purpose and Application
The dataset aims to showcase the unmatched literary quality of Classical Arabic, distinguished from Modern Standard Arabic, particularly in its preservation from the European translation trends of the 19th and 20th centuries:
- Refinement of Machine Translation (MT): The complex grammatical structures and rich lexicon of Classical Arabic present a unique challenge for MT systems, pushing the boundaries of translation accuracy and fluency.
- Development of Language Models: These texts serve as a foundation for training sophisticated Large Language Models (LLMs) capable of understanding and replicating the depth of Classical Arabic.
- Preservation of Linguistic Heritage: This dataset preserves the original form of Classical Arabic, providing a standard of excellence against which modern writings can be compared.
# Suggested Research Application: Iterative Translation Refinement
A notable application of this dataset is the enhancement of contemporary Arabic writing through back-translation. Existing models can back-translate English texts into Arabic, potentially producing a less sophisticated form. This offers an opportunity to:
- Generate Imperfect Arabic Corpus: Use back-translation to create a corpus of Arabic text that is less refined than the original Classical Arabic (a rough code sketch of this step follows the list below).
- Train Refinement Models: Develop models that refine the imperfect Arabic by comparing it to the original texts, aiming to restore the classical eloquence.
- Enhance Contemporary Arabic Writing: Apply these models to modern Arabic texts, elevating their literary quality by infusing classical stylistic elements, making the language resonate with its classical roots.
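The three steps above can be prototyped with off-the-shelf tools. The sketch below covers only the first step (building the imperfect-Arabic corpus by back-translating the English column); the repository id comes from this card, while the split name, the column names (`en`, `ar`), and the choice of MT model (`Helsinki-NLP/opus-mt-en-ar`) are illustrative assumptions that should be verified or swapped for your own setup.

```python
from datasets import load_dataset
from transformers import pipeline

# Assumptions: a "train" split, an "en" column holding the English translation,
# and an "ar" column holding the original Classical Arabic text.
ds = load_dataset("ImruQays/Thaqalayn-Classical-Arabic-English-Parallel-texts", split="train")

# Any general-purpose English->Arabic MT model can be used here.
mt_en_ar = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ar")

def back_translate(batch):
    # Step 1: produce "imperfect" Arabic by machine-translating the English column.
    outputs = mt_en_ar(batch["en"], max_length=512)
    batch["imperfect_ar"] = [o["translation_text"] for o in outputs]
    return batch

pairs = ds.map(back_translate, batched=True, batch_size=16)
# Keep only (imperfect_ar -> ar) pairs; they become training data for the
# refinement model in step 2, which is out of scope for this sketch.
pairs = pairs.remove_columns([c for c in pairs.column_names if c not in ("imperfect_ar", "ar")])
pairs.save_to_disk("imperfect_to_classical_pairs")
```

A sequence-to-sequence model trained on these pairs would then learn to map back-translated, less refined Arabic to the classical original, which is the refinement step described in the list above.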
# Credits
Credits go to the [Thaqalayn website](https://thaqalayn.net/) for their compilation of Arabic and English texts. Also, the original webscrape was done by [jenusi](https://github.com/jenusi) on GitHub in this [repo](https://github.com/jenusi/ThaqalaynScraper). I only compiled it in the form of two columns for texts from all books. I also converted the numbers from western Arabic (0123456789) to eastern Arabic (٠١٢٣٤٥٦٧٨٩). | ImruQays/Thaqalayn-Classical-Arabic-English-Parallel-texts | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:ar",
"language:en",
"region:us"
] | 2023-12-24T13:37:04+00:00 | {"language": ["ar", "en"], "size_categories": ["10K<n<100K"], "task_categories": ["translation"]} | 2023-12-24T14:46:48+00:00 | [] | [
"ar",
"en"
] | TAGS
#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us
|
# Introduction
This dataset represents a comprehensive collection of parallel Arabic-English texts from the Thaqalayn Hadith Library, a premier source for exploring the classical hadith tradition of the Imāmī Shia Muslim school of thought. The library focuses on making primary historical sources accessible, serving as a bridge between past wisdom and contemporary study. The dataset features translations of significant classical Imāmī hadith texts, allowing for a deep dive into the linguistic and cultural heritage of this era.
# Content Details
The Thaqalayn Hadith Library includes Arabic-English parallel texts from the following classical collections:
- Al-Kāfi (The Sufficient)
- Muʿjam al-Aḥādīth al-Muʿtabara (A Comprehensive Compilation of Reliable Narrations)
- Al-Khiṣāl (The Book of Characteristics)
- ʿUyūn akhbār al-Riḍā (The Source of Traditions on Imam al-Riḍā)
- Al-Amālī (The Dictations) by Shaykh Muḥammad b. Muḥammad al-Mufīd
- Al-Amālī (The Dictations) by Shaykh Muḥammad b. ʿAlī al-Ṣaduq
- Al-Tawḥīd (The Book of Divine Unity)
- Kitāb al-Ḍuʿafāʾ (The Weakened Ones)
- Kitāb al-Ghayba (The Book of Occultation) by Abū ʿAbd Allah Muḥammad b. Ibrāhīm al-Nuʿmānī
- Kitāb al-Ghayba (The Book of Occultation) by Shaykh Muḥammad b. al-Ḥasan al-Ṭūsī
- Thawāb al-Aʿmāl wa ʿiqāb al-Aʿmāl (The Rewards & Punishments of Deeds)
- Kāmil al-Ziyārāt (The Complete Pilgrimage Guide)
- Faḍaʾil al-Shīʿa (Virtues of the Shīʿa)
- Ṣifāt al-Shīʿa (Attributes of the Shīʿa)
- Maʿānī al-ʾAkhbār (The Meanings of Reports)
- Kitāb al-Muʾmin (The Book of the Believer)
- Kitāb al-Zuhd (The Book of Asceticism)
- Nahj al-Balāgha (The Peak of Eloquence)
# Purpose and Application
The dataset aims to showcase the unmatched literary quality of Classical Arabic, distinguished from Modern Standard Arabic, particularly in its preservation from the European translation trends of the 19th and 20th centuries:
- Refinement of Machine Translation (MT): The complex grammatical structures and rich lexicon of Classical Arabic present a unique challenge for MT systems, pushing the boundaries of translation accuracy and fluency.
- Development of Language Models: These texts serve as a foundation for training sophisticated Large Language Models (LLMs) capable of understanding and replicating the depth of Classical Arabic.
- Preservation of Linguistic Heritage: This dataset preserves the original form of Classical Arabic, providing a standard of excellence against which modern writings can be compared.
# Suggested Research Application: Iterative Translation Refinement
A notable application of this dataset is the enhancement of contemporary Arabic writing through back-translation. Existing models can back-translate English texts into Arabic, potentially producing a less sophisticated form. This offers an opportunity to:
- Generate Imperfect Arabic Corpus: Use back-translation to create a corpus of Arabic text that is less refined than the original Classical Arabic.
- Train Refinement Models: Develop models that refine the imperfect Arabic by comparing it to the original texts, aiming to restore the classical eloquence.
- Enhance Contemporary Arabic Writing: Apply these models to modern Arabic texts, elevating their literary quality by infusing classical stylistic elements, making the language resonate with its classical roots.
# Credits
Credits go to the Thaqalayn website for their compilation of Arabic and English texts. Also, the original webscrape was done by jenusi on GitHub in this repo. I only compiled it in the form of two columns for texts from all books. I also converted the numbers from western Arabic (0123456789) to eastern Arabic (٠١٢٣٤٥٦٧٨٩). | [
"# Introduction\n\nThis dataset represents a comprehensive collection of parallel Arabic-English texts from the Thaqalayn Hadith Library, a premier source for exploring the classical hadith tradition of the Imāmī Shia Muslim school of thought. The library focuses on making primary historical sources accessible, serving as a bridge between past wisdom and contemporary study. The dataset features translations of significant classical Imāmī hadith texts, allowing for a deep dive into the linguistic and cultural heritage of this era.",
"# Content Details\n\nThe Thaqalayn Hadith Library includes Arabic-English parallel texts from the following classical collections:\n\n- Al-Kāfi (The Sufficient)\n- Muʿjam al-Aḥādīth al-Muʿtabara (A Comprehensive Compilation of Reliable Narrations)\n- Al-Khiṣāl (The Book of Characteristics)\n- ʿUyūn akhbār al-Riḍā (The Source of Traditions on Imam al-Riḍā)\n- Al-Amālī (The Dictations) by Shaykh Muḥammad b. Muḥammad al-Mufīd\n- Al-Amālī (The Dictations) by Shaykh Muḥammad b. ʿAlī al-Ṣaduq\n- Al-Tawḥīd (The Book of Divine Unity)\n- Kitāb al-Ḍuʿafāʾ (The Weakened Ones)\n- Kitāb al-Ghayba (The Book of Occultation) by Abū ʿAbd Allah Muḥammad b. Ibrāhīm al-Nuʿmānī\n- Kitāb al-Ghayba (The Book of Occultation) by Shaykh Muḥammad b. al-Ḥasan al-Ṭūsī\n- Thawāb al-Aʿmāl wa ʿiqāb al-Aʿmāl (The Rewards & Punishments of Deeds)\n- Kāmil al-Ziyārāt (The Complete Pilgrimage Guide)\n- Faḍaʾil al-Shīʿa (Virtues of the Shīʿa)\n- Ṣifāt al-Shīʿa (Attributes of the Shīʿa)\n- Maʿānī al-ʾAkhbār (The Meanings of Reports)\n- Kitāb al-Muʾmin (The Book of the Believer)\n- Kitāb al-Zuhd (The Book of Asceticism)\n- Nahj al-Balāgha (The Peak of Eloquence)",
"# Purpose and Application\n\nThe dataset aims to showcase the unmatched literary quality of Classical Arabic, distinguished from Modern Standard Arabic, particularly in its preservation from the European translation trends of the 19th and 20th centuries:\n\n- Refinement of Machine Translation (MT): The complex grammatical structures and rich lexicon of Classical Arabic present a unique challenge for MT systems, pushing the boundaries of translation accuracy and fluency.\n- Development of Language Models: These texts serve as a foundation for training sophisticated Large Language Models (LLMs) capable of understanding and replicating the depth of Classical Arabic.\n- Preservation of Linguistic Heritage: This dataset preserves the original form of Classical Arabic, providing a standard of excellence against which modern writings can be compared.",
"# Suggested Research Application: Iterative Translation Refinement\n\nA notable application of this dataset is the enhancement of contemporary Arabic writing through back-translation. Existing models can back-translate English texts into Arabic, potentially producing a less sophisticated form. This offers an opportunity to:\n\n- Generate Imperfect Arabic Corpus: Use back-translation to create a corpus of Arabic text that is less refined than the original Classical Arabic.\n- Train Refinement Models: Develop models that refine the imperfect Arabic by comparing it to the original texts, aiming to restore the classical eloquence.\n- Enhance Contemporary Arabic Writing: Apply these models to modern Arabic texts, elevating their literary quality by infusing classical stylistic elements, making the language resonate with its classical roots.",
"# Credits\nCredits go the Thaqalayn website for their compilation of Arabic and English texts. Also, the original webscrape is done by jenusi on GitHub in this repo. I only compiled it in the form of two columns for texts from all books. I also converted the numbers from western Arabic (0123456789) to eastern Arabic (٠١٢٣٤٥٦٧٨٩)."
] | [
"TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us \n",
"# Introduction\n\nThis dataset represents a comprehensive collection of parallel Arabic-English texts from the Thaqalayn Hadith Library, a premier source for exploring the classical hadith tradition of the Imāmī Shia Muslim school of thought. The library focuses on making primary historical sources accessible, serving as a bridge between past wisdom and contemporary study. The dataset features translations of significant classical Imāmī hadith texts, allowing for a deep dive into the linguistic and cultural heritage of this era.",
"# Content Details\n\nThe Thaqalayn Hadith Library includes Arabic-English parallel texts from the following classical collections:\n\n- Al-Kāfi (The Sufficient)\n- Muʿjam al-Aḥādīth al-Muʿtabara (A Comprehensive Compilation of Reliable Narrations)\n- Al-Khiṣāl (The Book of Characteristics)\n- ʿUyūn akhbār al-Riḍā (The Source of Traditions on Imam al-Riḍā)\n- Al-Amālī (The Dictations) by Shaykh Muḥammad b. Muḥammad al-Mufīd\n- Al-Amālī (The Dictations) by Shaykh Muḥammad b. ʿAlī al-Ṣaduq\n- Al-Tawḥīd (The Book of Divine Unity)\n- Kitāb al-Ḍuʿafāʾ (The Weakened Ones)\n- Kitāb al-Ghayba (The Book of Occultation) by Abū ʿAbd Allah Muḥammad b. Ibrāhīm al-Nuʿmānī\n- Kitāb al-Ghayba (The Book of Occultation) by Shaykh Muḥammad b. al-Ḥasan al-Ṭūsī\n- Thawāb al-Aʿmāl wa ʿiqāb al-Aʿmāl (The Rewards & Punishments of Deeds)\n- Kāmil al-Ziyārāt (The Complete Pilgrimage Guide)\n- Faḍaʾil al-Shīʿa (Virtues of the Shīʿa)\n- Ṣifāt al-Shīʿa (Attributes of the Shīʿa)\n- Maʿānī al-ʾAkhbār (The Meanings of Reports)\n- Kitāb al-Muʾmin (The Book of the Believer)\n- Kitāb al-Zuhd (The Book of Asceticism)\n- Nahj al-Balāgha (The Peak of Eloquence)",
"# Purpose and Application\n\nThe dataset aims to showcase the unmatched literary quality of Classical Arabic, distinguished from Modern Standard Arabic, particularly in its preservation from the European translation trends of the 19th and 20th centuries:\n\n- Refinement of Machine Translation (MT): The complex grammatical structures and rich lexicon of Classical Arabic present a unique challenge for MT systems, pushing the boundaries of translation accuracy and fluency.\n- Development of Language Models: These texts serve as a foundation for training sophisticated Large Language Models (LLMs) capable of understanding and replicating the depth of Classical Arabic.\n- Preservation of Linguistic Heritage: This dataset preserves the original form of Classical Arabic, providing a standard of excellence against which modern writings can be compared.",
"# Suggested Research Application: Iterative Translation Refinement\n\nA notable application of this dataset is the enhancement of contemporary Arabic writing through back-translation. Existing models can back-translate English texts into Arabic, potentially producing a less sophisticated form. This offers an opportunity to:\n\n- Generate Imperfect Arabic Corpus: Use back-translation to create a corpus of Arabic text that is less refined than the original Classical Arabic.\n- Train Refinement Models: Develop models that refine the imperfect Arabic by comparing it to the original texts, aiming to restore the classical eloquence.\n- Enhance Contemporary Arabic Writing: Apply these models to modern Arabic texts, elevating their literary quality by infusing classical stylistic elements, making the language resonate with its classical roots.",
"# Credits\nCredits go the Thaqalayn website for their compilation of Arabic and English texts. Also, the original webscrape is done by jenusi on GitHub in this repo. I only compiled it in the form of two columns for texts from all books. I also converted the numbers from western Arabic (0123456789) to eastern Arabic (٠١٢٣٤٥٦٧٨٩)."
] | [
36,
114,
460,
184,
190,
93
] | [
"passage: TAGS\n#task_categories-translation #size_categories-10K<n<100K #language-Arabic #language-English #region-us \n# Introduction\n\nThis dataset represents a comprehensive collection of parallel Arabic-English texts from the Thaqalayn Hadith Library, a premier source for exploring the classical hadith tradition of the Imāmī Shia Muslim school of thought. The library focuses on making primary historical sources accessible, serving as a bridge between past wisdom and contemporary study. The dataset features translations of significant classical Imāmī hadith texts, allowing for a deep dive into the linguistic and cultural heritage of this era."
] |
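To make the back-translation suggestion above concrete, here is a minimal, illustrative sketch. Everything model-specific in it is an assumption: `Helsinki-NLP/opus-mt-en-ar` merely stands in for any English-to-Arabic MT checkpoint, and the sample row is a placeholder rather than actual Thaqalayn data.

```python
from transformers import pipeline

# Stand-in English->Arabic translator (assumption: any MT checkpoint would do here).
en_to_ar = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ar")

def build_refinement_pairs(parallel_rows, batch_size=16):
    """Turn (classical_arabic, english) rows into (imperfect_arabic, classical_arabic) pairs.

    The English side is machine-translated back into Arabic, which tends to
    produce a flatter register; the original Classical Arabic then serves as
    the target for a refinement model.
    """
    pairs = []
    for start in range(0, len(parallel_rows), batch_size):
        batch = parallel_rows[start:start + batch_size]
        outputs = en_to_ar([english for _, english in batch], max_length=512)
        for (classical_ar, _), out in zip(batch, outputs):
            pairs.append({"source": out["translation_text"], "target": classical_ar})
    return pairs

# Placeholder example row, not real data from the dataset.
rows = [("الحمد لله رب العالمين", "Praise be to God, Lord of the worlds")]
print(build_refinement_pairs(rows)[0])
```

Fine-tuning the refinement model on these (source, target) pairs is then an ordinary sequence-to-sequence training problem.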
de7245bd3b1b534781e3ab4b60422b6f73526cf9 | Test https://aii.cx dataset AI tools
---
license: apache-2.0
---
Digital Marketers|🔍|SEO Trend Analysis|SEO|09 JUN 2024|Kevin Brown|SEO, Trends, Marketing|Essential for digital marketers focusing on web presence.|Discover how to analyze and capitalize on SEO trends with this prompt, enhancing your online visibility.|Intermediate
E-commerce Managers|💼|Customer Retention Strategies|E-commerce|12 JUL 2024|Rachel Green|Retention, Customer Service, Loyalty|Key for managers aiming to increase repeat business.|This prompt provides insights into effective strategies for retaining customers and ensuring long-term loyalty.|Intermediate
HR Recruiters|🤝|Effective Interviewing Techniques|Human Resources|28 AUG 2024|Monica Geller|Recruiting, Interviewing, Talent Acquisition|Crucial for HR professionals in all industries.|Learn to master interviewing techniques that identify the best candidates and fit for your team.|Beginner
Data Analysts|📈|Enhancing Data Visualization|Data Analysis|16 SEP 2024|Ross Geller|Visualization, Analytics, Insights|Important for professionals in data-heavy roles.|This prompt helps you enhance your data visualizations to convey complex information clearly and effectively.|Intermediate
Multimedia Artists|👓|Augmented Reality Projects|Creative Design|02 NOV 2024|Chandler Bing|AR, Technology, Innovation|Ideal for artists and designers in tech-forward fields.|Explore the cutting edge of augmented reality design and create immersive experiences with this prompt.|Advanced
Digital Marketers|💬|Social Media Engagement Plans|SMM|21 DEC 2024|Joey Tribbiani|Social Media, Engagement, Strategy|Vital for marketers focusing on digital communities.|Learn strategies to boost interaction and build a loyal community on various social media platforms.|Intermediate | blozixdextr/test | [
"region:us"
] | 2023-12-24T13:39:16+00:00 | {} | 2023-12-24T13:42:01+00:00 | [] | [] | TAGS
#region-us
| Test URL dataset AI tools
---
license: apache-2.0
---
Digital Marketers||SEO Trend Analysis|SEO|09 JUN 2024|Kevin Brown|SEO, Trends, Marketing|Essential for digital marketers focusing on web presence.|Discover how to analyze and capitalize on SEO trends with this prompt, enhancing your online visibility.|Intermediate
E-commerce Managers||Customer Retention Strategies|E-commerce|12 JUL 2024|Rachel Green|Retention, Customer Service, Loyalty|Key for managers aiming to increase repeat business.|This prompt provides insights into effective strategies for retaining customers and ensuring long-term loyalty.|Intermediate
HR Recruiters||Effective Interviewing Techniques|Human Resources|28 AUG 2024|Monica Geller|Recruiting, Interviewing, Talent Acquisition|Crucial for HR professionals in all industries.|Learn to master interviewing techniques that identify the best candidates and fit for your team.|Beginner
Data Analysts||Enhancing Data Visualization|Data Analysis|16 SEP 2024|Ross Geller|Visualization, Analytics, Insights|Important for professionals in data-heavy roles.|This prompt helps you enhance your data visualizations to convey complex information clearly and effectively.|Intermediate
Multimedia Artists||Augmented Reality Projects|Creative Design|02 NOV 2024|Chandler Bing|AR, Technology, Innovation|Ideal for artists and designers in tech-forward fields.|Explore the cutting edge of augmented reality design and create immersive experiences with this prompt.|Advanced
Digital Marketers||Social Media Engagement Plans|SMM|21 DEC 2024|Joey Tribbiani|Social Media, Engagement, Strategy|Vital for marketers focusing on digital communities.|Learn strategies to boost interaction and build a loyal community on various social media platforms.|Intermediate | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
c535542b607cf7f68610fb163876b6ff9a2ccb5f |
# Dataset Card for Evaluation run of l3utterfly/minima-3b-layla-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [l3utterfly/minima-3b-layla-v2](https://huggingface.co/l3utterfly/minima-3b-layla-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2",
"harness_winogrande_5",
split="train")
```
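
The aggregated scores live in the separate "results" configuration described above and can be loaded the same way. This is only a sketch: the configuration name "results" is taken from the description above, and the "latest" split mirrors the per-task configurations, so adjust either if the repository layout differs.

```python
from datasets import load_dataset

# Aggregated run-level metrics; "results" and "latest" are assumptions based on
# the card text and the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2",
    "results",
    split="latest",
)
print(results[0])  # one row per evaluation run, mirroring the JSON shown under "Latest results"
```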
## Latest results
These are the [latest results from run 2023-12-24T14:17:53.842048](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2/blob/main/results_2023-12-24T14-17-53.842048.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.29413179589868294,
"acc_stderr": 0.032120734757445236,
"acc_norm": 0.2950139868612686,
"acc_norm_stderr": 0.03284694916274657,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834559,
"mc2": 0.43635958393568464,
"mc2_stderr": 0.014541344314280011
},
"harness|arc:challenge|25": {
"acc": 0.4138225255972696,
"acc_stderr": 0.014392730009221007,
"acc_norm": 0.44197952218430037,
"acc_norm_stderr": 0.014512682523128345
},
"harness|hellaswag|10": {
"acc": 0.522903804023103,
"acc_stderr": 0.004984543540932335,
"acc_norm": 0.6992630950009958,
"acc_norm_stderr": 0.0045764127139515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.03979236637497411,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.03979236637497411
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.17733990147783252,
"acc_stderr": 0.02687433727680835,
"acc_norm": 0.17733990147783252,
"acc_norm_stderr": 0.02687433727680835
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.035679697722680474,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.035679697722680474
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2923076923076923,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.2923076923076923,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02606715922227578,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02606715922227578
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.01910929984609829,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.01910929984609829
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.031546962856566295,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.031546962856566295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575498,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575498
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.031426169937919246,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.031426169937919246
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29246487867177523,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.29246487867177523,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3092485549132948,
"acc_stderr": 0.02488314057007175,
"acc_norm": 0.3092485549132948,
"acc_norm_stderr": 0.02488314057007175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602255,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427904,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3536977491961415,
"acc_stderr": 0.02715520810320088,
"acc_norm": 0.3536977491961415,
"acc_norm_stderr": 0.02715520810320088
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23765432098765432,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.23765432098765432,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.011139857833598507,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.011139857833598507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28594771241830064,
"acc_stderr": 0.018280485072954673,
"acc_norm": 0.28594771241830064,
"acc_norm_stderr": 0.018280485072954673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.030862144921087565,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.030862144921087565
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.03645981377388807,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.03645981377388807
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834559,
"mc2": 0.43635958393568464,
"mc2_stderr": 0.014541344314280011
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.013366596951934376
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.007740044337103801
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2 | [
"region:us"
] | 2023-12-24T14:20:14+00:00 | {"pretty_name": "Evaluation run of l3utterfly/minima-3b-layla-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [l3utterfly/minima-3b-layla-v2](https://huggingface.co/l3utterfly/minima-3b-layla-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T14:17:53.842048](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__minima-3b-layla-v2/blob/main/results_2023-12-24T14-17-53.842048.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29413179589868294,\n \"acc_stderr\": 0.032120734757445236,\n \"acc_norm\": 0.2950139868612686,\n \"acc_norm_stderr\": 0.03284694916274657,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834559,\n \"mc2\": 0.43635958393568464,\n \"mc2_stderr\": 0.014541344314280011\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4138225255972696,\n \"acc_stderr\": 0.014392730009221007,\n \"acc_norm\": 0.44197952218430037,\n \"acc_norm_stderr\": 0.014512682523128345\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.522903804023103,\n \"acc_stderr\": 0.004984543540932335,\n \"acc_norm\": 0.6992630950009958,\n \"acc_norm_stderr\": 0.0045764127139515\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n 
\"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342343,\n \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342343\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.03979236637497411,\n \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.03979236637497411\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.17733990147783252,\n \"acc_stderr\": 0.02687433727680835,\n \"acc_norm\": 0.17733990147783252,\n \"acc_norm_stderr\": 0.02687433727680835\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.035679697722680474,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.035679697722680474\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735704,\n \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735704\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.02306043838085774,\n \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.02306043838085774\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02606715922227578,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02606715922227578\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27339449541284405,\n \"acc_stderr\": 0.01910929984609829,\n \"acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.01910929984609829\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.031546962856566295,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.031546962856566295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n \"acc_stderr\": 0.03138147637575498,\n \"acc_norm\": 0.32286995515695066,\n \"acc_norm_stderr\": 0.03138147637575498\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.031426169937919246,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.031426169937919246\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n \"acc_stderr\": 
0.016267000684598642,\n \"acc_norm\": 0.29246487867177523,\n \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3092485549132948,\n \"acc_stderr\": 0.02488314057007175,\n \"acc_norm\": 0.3092485549132948,\n \"acc_norm_stderr\": 0.02488314057007175\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n \"acc_stderr\": 0.015268677317602255,\n \"acc_norm\": 0.29608938547486036,\n \"acc_norm_stderr\": 0.015268677317602255\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427904,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427904\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3536977491961415,\n \"acc_stderr\": 0.02715520810320088,\n \"acc_norm\": 0.3536977491961415,\n \"acc_norm_stderr\": 0.02715520810320088\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23765432098765432,\n \"acc_stderr\": 0.023683591837008553,\n \"acc_norm\": 0.23765432098765432,\n \"acc_norm_stderr\": 0.023683591837008553\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n \"acc_stderr\": 0.011139857833598507,\n \"acc_norm\": 0.25554106910039115,\n \"acc_norm_stderr\": 0.011139857833598507\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28594771241830064,\n \"acc_stderr\": 0.018280485072954673,\n \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.018280485072954673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.030862144921087565,\n \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.030862144921087565\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.32338308457711445,\n \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.03645981377388807,\n \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.03645981377388807\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834559,\n \"mc2\": 0.43635958393568464,\n \"mc2_stderr\": 0.014541344314280011\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.007740044337103801\n }\n}\n```", 
"repo_url": "https://huggingface.co/l3utterfly/minima-3b-layla-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|arc:challenge|25_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|gsm8k|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hellaswag|10_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-17-53.842048.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-17-53.842048.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-17-53.842048.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T14-17-53.842048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-17-53.842048.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T14_17_53.842048", "path": ["**/details_harness|winogrande|5_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T14-17-53.842048.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_24T14_17_53.842048", "path": ["results_2023-12-24T14-17-53.842048.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T14-17-53.842048.parquet"]}]}]} | 2023-12-24T14:20:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of l3utterfly/minima-3b-layla-v2
Dataset automatically created during the evaluation run of model l3utterfly/minima-3b-layla-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-24T14:17:53.842048 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of l3utterfly/minima-3b-layla-v2\n\n\n\nDataset automatically created during the evaluation run of model l3utterfly/minima-3b-layla-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T14:17:53.842048(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of l3utterfly/minima-3b-layla-v2\n\n\n\nDataset automatically created during the evaluation run of model l3utterfly/minima-3b-layla-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T14:17:53.842048(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of l3utterfly/minima-3b-layla-v2\n\n\n\nDataset automatically created during the evaluation run of model l3utterfly/minima-3b-layla-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T14:17:53.842048(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
6bc13568e9a2d809a655b11354eef07f2b2f7bf8 |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1",
"harness_winogrande_5",
split="train")
```
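Every per-task configuration listed in this card's metadata can be loaded the same way by swapping the configuration name. The snippet below is only a sketch built from the configuration and split names declared in this card ("results" and "latest"); the exact columns of the aggregated parquet file are not documented here, so inspect the loaded dataset before relying on any field name.

```python
from datasets import load_dataset

# Load the aggregated "results" configuration at its most recent split.
# The config name "results" and the split "latest" are taken from this card's
# file listing; the column layout of the parquet file is an assumption,
# so print the dataset first to see what is actually available.
results = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1",
                       "results",
                       split="latest")
print(results)
```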
## Latest results
These are the [latest results from run 2023-12-24T14:37:13.200046](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1/blob/main/results_2023-12-24T14-37-13.200046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6573576723388176,
"acc_stderr": 0.031870045157462945,
"acc_norm": 0.6576373888298446,
"acc_norm_stderr": 0.032519760947002464,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6343069279156973,
"mc2_stderr": 0.015373751500509238
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892976,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344003
},
"harness|hellaswag|10": {
"acc": 0.698864767974507,
"acc_stderr": 0.0045781379492981725,
"acc_norm": 0.8669587731527584,
"acc_norm_stderr": 0.003389251991438502
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.02380763319865727,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.02380763319865727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101003,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101003
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6343069279156973,
"mc2_stderr": 0.015373751500509238
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487055
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198667
}
}
```
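To turn the blob above into a single headline number, one option is to average the MMLU (hendrycksTest) sub-task scores. The sketch below assumes the JSON shown above has been saved locally as `results.json` (a hypothetical filename); the `harness|hendrycksTest-` key prefix and the `acc_norm` field are taken directly from the blob itself.

```python
import json

# Hypothetical local copy of the aggregated results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm over the MMLU (hendrycksTest) sub-tasks; the key prefix
# and the field name come from the structure of the JSON above.
mmlu = [
    scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```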
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1 | [
"region:us"
] | 2023-12-24T14:39:29+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T14:37:13.200046](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1/blob/main/results_2023-12-24T14-37-13.200046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6573576723388176,\n \"acc_stderr\": 0.031870045157462945,\n \"acc_norm\": 0.6576373888298446,\n \"acc_norm_stderr\": 0.032519760947002464,\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6343069279156973,\n \"mc2_stderr\": 0.015373751500509238\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892976,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344003\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.698864767974507,\n \"acc_stderr\": 0.0045781379492981725,\n \"acc_norm\": 0.8669587731527584,\n \"acc_norm_stderr\": 0.003389251991438502\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.02380763319865727,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.02380763319865727\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8352490421455939,\n \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101003,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101003\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623227,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6343069279156973,\n \"mc2_stderr\": 0.015373751500509238\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487055\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \"acc_stderr\": 0.012670420440198667\n }\n}\n```", "repo_url": 
"https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|arc:challenge|25_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|gsm8k|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hellaswag|10_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-37-13.200046.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-37-13.200046.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-37-13.200046.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T14-37-13.200046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-37-13.200046.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T14-37-13.200046.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["**/details_harness|winogrande|5_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T14-37-13.200046.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T14_37_13.200046", "path": ["results_2023-12-24T14-37-13.200046.parquet"]}, {"split": "latest", "path": 
["results_2023-12-24T14-37-13.200046.parquet"]}]}]} | 2023-12-24T14:39:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1
Dataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
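The snippet below is a minimal sketch: the repository id `open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1` is assumed from the naming convention of these leaderboard detail datasets, and `harness_winogrande_5` is one of the configurations listed in this card's metadata.

```python
from datasets import load_dataset

# Load one task configuration of the evaluation details;
# the "train" split always points to the latest results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.1",
    "harness_winogrande_5",
    split="train",
)
```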
## Latest results
These are the latest results from run 2023-12-24T14:37:13.200046 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T14:37:13.200046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T14:37:13.200046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.1\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T14:37:13.200046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
258ad9b01e1d5f285a8e162d8bf2b74d3d97597b |
## Dataset Summary
NER-News-BIDataset is a dataset for named entity recognition (NER) in news articles, publicly released by the National Institute of Korean Language in 2023.
The dataset is labeled with named entities specifically for news data.
It consists of a total of 150,142 sentences, and entities are categorized into 150 entity types for recognition.
## Languages
Korean
## Data Structure
```
DatasetDict({
    train: Dataset({
        features: ['input_ids', 'attention_mask', 'labels'],
        num_rows: 120113
    })
    test: Dataset({
        features: ['input_ids', 'attention_mask', 'labels'],
        num_rows: 30029
    })
})
```
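A minimal loading sketch (assuming the Hugging Face `datasets` library; `yeajinmin/NER-News-BIDataset` is the repository id of this dataset):

```python
from datasets import load_dataset

# Loading both splits should print the DatasetDict shown above
# (120,113 training rows and 30,029 test rows).
ds = load_dataset("yeajinmin/NER-News-BIDataset")
print(ds)

# Inspect one training instance: token ids, attention mask and numeric labels.
example = ds["train"][0]
print(example["input_ids"][:10])
print(example["attention_mask"][:10])
print(example["labels"][:10])
```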
### Data Instances
The dataset is provided in text format with train/test sets.
Each instance represents a single sentence from a news article, and if there is an entity in the sentence, it is tagged with the corresponding label.
In cases where a single entity is separated into multiple tokens, the first token is labeled as "B-entity" and the subsequent tokens are labeled as "I-entity" until the end.
### Data Fields
input_ids: "A processed named entity corpus of news articles constructed in 2022" has been tokenized and represented with numerical values.
label: Identified a total of 151 entities, including the 0th label (not an entity). If counting both "B-entity" and "I-entity" labels for each entity, there are a total of 301 labels.
The labeling is done with numerical values.
The 151 types of labels are as follows:
|index|0|1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30|31|32|33|34|35|36|37|38|39|40|41|42|43|44|45|46|47|48|49|50|51|52|53|54|55|56|57|58|59|60|61|62|63|64|65|66|67|68|69|70|71|72|73|74|75|76|77|78|79|80|81|82|83|84|85|86|87|88|89|90|91|92|93|94|95|96|97|98|99|100|101|102|103|104|105|106|107|108|109|110|111|112|113|114|115|116|117|118|119|120|121|122|123|124|125|126|127|128|129|130|131|132|133|134|135|136|137|138|139|140|141|142|143|144|145|146|147|148|149|150|151|152|153|154|155|156|157|158|159|160|161|162|163|164|165|166|167|168|169|170|171|172|173|174|175|176|177|178|179|180|181|182|183|184|185|186|187|188|189|190|191|192|193|194|195|196|197|198|199|200|201|202|203|204|205|206|207|208|209|210|211|212|213|214|215|216|217|218|219|220|221|222|223|224|225|226|227|228|229|230|231|232|233|234|235|236|237|238|239|240|241|242|243|244|245|246|247|248|249|250|251|252|253|254|255|256|257|258|259|260|261|262|263|264|265|266|267|268|269|270|271|272|273|274|275|276|277|278|279|280|281|282|283|284|285|286|287|288|289|290|291|292|293|294|295|296|297|298|299|300|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
|Label|O|B-PS\_NAME|B-PS\_CHARACTER|B-PS\_PET|B-FD\_SCIENCE|B-FD\_SOCIAL\_SCIENCE|B-FD\_MEDICINE|B-FD\_ART|B-FD\_HUMANITIES|B-FD\_OTHERS|B-TR\_SCIENCE|B-TR\_SOCIAL\_SCIENCE|B-TR\_MEDICINE|B-TR\_ART|B-TR\_HUMANITIES|B-TR\_OTHERS|B-AF\_BUILDING|B-AF\_CULTURAL\_ASSET|B-AF\_ROAD|B-AF\_TRANSPORT|B-AF\_MUSICAL\_INSTRUMENT|B-AF\_WEAPON|B-AFA\_DOCUMENT|B-AFA\_PERFORMANCE|B-AFA\_VIDEO|B-AFA\_ART\_CRAFT|B-AFA\_MUSIC|B-AFW\_SERVICE\_PRODUCTS|B-AFW\_OTHER\_PRODUCTS|B-OGG\_ECONOMY|B-OGG\_EDUCATION|B-OGG\_MILITARY|B-OGG\_MEDIA|B-OGG\_SPORTS|B-OGG\_ART|B-OGG\_MEDICINE|B-OGG\_RELIGION|B-OGG\_SCIENCE|B-OGG\_LIBRARY|B-OGG\_LAW|B-OGG\_POLITICS|B-OGG\_FOOD|B-OGG\_HOTEL|B-OGG\_OTHERS|B-LCP\_COUNTRY|B-LCP\_PROVINCE|B-LCP\_COUNTY|B-LCP\_CITY|B-LCP\_CAPITALCITY|B-LCG\_RIVER|B-LCG\_OCEAN|B-LCG\_BAY|B-LCG\_MOUNTAIN|B-LCG\_ISLAND|B-LCG\_CONTINENT|B-LC\_SPACE|B-LC\_OTHERS|B-CV\_CULTURE|B-CV\_TRIBE|B-CV\_LANGUAGE|B-CV\_POLICY|B-CV\_LAW|B-CV\_CURRENCY|B-CV\_TAX|B-CV\_FUNDS|B-CV\_ART|B-CV\_SPORTS|B-CV\_SPORTS\_POSITION|B-CV\_SPORTS\_INST|B-CV\_PRIZE|B-CV\_RELATION|B-CV\_OCCUPATION|B-CV\_POSITION|B-CV\_FOOD|B-CV\_DRINK|B-CV\_FOOD\_STYLE|B-CV\_CLOTHING|B-CV\_BUILDING\_TYPE|B-DT\_DURATION|B-DT\_DAY|B-DT\_WEEK|B-DT\_MONTH|B-DT\_YEAR|B-DT\_SEASON|B-DT\_GEOAGE|B-DT\_DYNASTY|B-DT\_OTHERS|B-TI\_DURATION|B-TI\_HOUR|B-TI\_MINUTE|B-TI\_SECOND|B-TI\_OTHERS|B-QT\_AGE|B-QT\_SIZE|B-QT\_LENGTH|B-QT\_COUNT|B-QT\_MAN\_COUNT|B-QT\_WEIGHT|B-QT\_PERCENTAGE|B-QT\_SPEED|B-QT\_TEMPERATURE|B-QT\_VOLUME|B-QT\_ORDER|B-QT\_PRICE|B-QT\_PHONE|B-QT\_SPORTS|B-QT\_CHANNEL|B-QT\_ALBUM|B-QT\_ADDRESS|B-QT\_OTHERS|B-EV\_ACTIVITY|B-EV\_WAR\_REVOLUTION|B-EV\_SPORTS|B-EV\_FESTIVAL|B-EV\_OTHERS|B-AM\_INSECT|B-AM\_BIRD|B-AM\_FISH|B-AM\_MAMMALIA|B-AM\_AMPHIBIA|B-AM\_REPTILIA|B-AM\_TYPE|B-AM\_PART|B-AM\_OTHERS|B-PT\_FRUIT|B-PT\_FLOWER|B-PT\_TREE|B-PT\_GRASS|B-PT\_TYPE|B-PT\_PART|B-PT\_OTHERS|B-MT\_ELEMENT|B-MT\_METAL|B-MT\_ROCK|B-MT\_CHEMICAL|B-TM\_COLOR|B-TM\_DIRECTION|B-TM\_CLIMATE|B-TM\_SHAPE|B-TM\_CELL\_TISSUE\_ORGAN|B-TMM\_DISEASE|B-TMM\_DRUG|B-TMI\_HW|B-TMI\_SW|B-TMI\_SITE|B-TMI\_EMAIL|B-TMI\_MODEL|B-TMI\_SERVICE|B-TMI\_PROJECT|B-TMIG\_GENRE|B-TM\_SPORTS|I-PS\_NAME|I-PS\_CHARACTER|I-PS\_PET|I-FD\_SCIENCE|I-FD\_SOCIAL\_SCIENCE|I-FD\_MEDICINE|I-FD\_ART|I-FD\_HUMANITIES|I-FD\_OTHERS|I-TR\_SCIENCE|I-TR\_SOCIAL\_SCIENCE|I-TR\_MEDICINE|I-TR\_ART|I-TR\_HUMANITIES|I-TR\_OTHERS|I-AF\_BUILDING|I-AF\_CULTURAL\_ASSET|I-AF\_ROAD|I-AF\_TRANSPORT|I-AF\_MUSICAL\_INSTRUMENT|I-AF\_WEAPON|I-AFA\_DOCUMENT|I-AFA\_PERFORMANCE|I-AFA\_VIDEO|I-AFA\_ART\_CRAFT|I-AFA\_MUSIC|I-AFW\_SERVICE\_PRODUCTS|I-AFW\_OTHER\_PRODUCTS|I-OGG\_ECONOMY|I-OGG\_EDUCATION|I-OGG\_MILITARY|I-OGG\_MEDIA|I-OGG\_SPORTS|I-OGG\_ART|I-OGG\_MEDICINE|I-OGG\_RELIGION|I-OGG\_SCIENCE|I-OGG\_LIBRARY|I-OGG\_LAW|I-OGG\_POLITICS|I-OGG\_FOOD|I-OGG\_HOTEL|I-OGG\_OTHERS|I-LCP\_COUNTRY|I-LCP\_PROVINCE|I-LCP\_COUNTY|I-LCP\_CITY|I-LCP\_CAPITALCITY|I-LCG\_RIVER|I-LCG\_OCEAN|I-LCG\_BAY|I-LCG\_MOUNTAIN|I-LCG\_ISLAND|I-LCG\_CONTINENT|I-LC\_SPACE|I-LC\_OTHERS|I-CV\_CULTURE|I-CV\_TRIBE|I-CV\_LANGUAGE|I-CV\_POLICY|I-CV\_LAW|I-CV\_CURRENCY|I-CV\_TAX|I-CV\_FUNDS|I-CV\_ART|I-CV\_SPORTS|I-CV\_SPORTS\_POSITION|I-CV\_SPORTS\_INST|I-CV\_PRIZE|I-CV\_RELATION|I-CV\_OCCUPATION|I-CV\_POSITION|I-CV\_FOOD|I-CV\_DRINK|I-CV\_FOOD\_STYLE|I-CV\_CLOTHING|I-CV\_BUILDING\_TYPE|I-DT\_DURATION|I-DT\_DAY|I-DT\_WEEK|I-DT\_MONTH|I-DT\_YEAR|I-DT\_SEASON|I-DT\_GEOAGE|I-DT\_DYNASTY|I-DT\_OTHERS|I-TI\_DURATION|I-TI\_HOUR|I-TI\_MINUTE|I-TI\_SECOND|I-TI\_OTHERS|I-QT\_AGE|I-QT\_SIZE|I-QT\_LENGTH|I-QT\_COUNT|I-QT\_MAN\_COUNT|I-QT\_WEIGHT|I-QT\_PERCENTAGE|I-QT\_SPEED
|I-QT\_TEMPERATURE|I-QT\_VOLUME|I-QT\_ORDER|I-QT\_PRICE|I-QT\_PHONE|I-QT\_SPORTS|I-QT\_CHANNEL|I-QT\_ALBUM|I-QT\_ADDRESS|I-QT\_OTHERS|I-EV\_ACTIVITY|I-EV\_WAR\_REVOLUTION|I-EV\_SPORTS|I-EV\_FESTIVAL|I-EV\_OTHERS|I-AM\_INSECT|I-AM\_BIRD|I-AM\_FISH|I-AM\_MAMMALIA|I-AM\_AMPHIBIA|I-AM\_REPTILIA|I-AM\_TYPE|I-AM\_PART|I-AM\_OTHERS|I-PT\_FRUIT|I-PT\_FLOWER|I-PT\_TREE|I-PT\_GRASS|I-PT\_TYPE|I-PT\_PART|I-PT\_OTHERS|I-MT\_ELEMENT|I-MT\_METAL|I-MT\_ROCK|I-MT\_CHEMICAL|I-TM\_COLOR|I-TM\_DIRECTION|I-TM\_CLIMATE|I-TM\_SHAPE|I-TM\_CELL\_TISSUE\_ORGAN|I-TMM\_DISEASE|I-TMM\_DRUG|I-TMI\_HW|I-TMI\_SW|I-TMI\_SITE|I-TMI\_EMAIL|I-TMI\_MODEL|I-TMI\_SERVICE|I-TMI\_PROJECT|I-TMIG\_GENRE|I-TM\_SPORTS|
|Number|0|1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30|31|32|33|34|35|36|37|38|39|40|41|42|43|44|45|46|47|48|49|50|51|52|53|54|55|56|57|58|59|60|61|62|63|64|65|66|67|68|69|70|71|72|73|74|75|76|77|78|79|80|81|82|83|84|85|86|87|88|89|90|91|92|93|94|95|96|97|98|99|100|101|102|103|104|105|106|107|108|109|110|111|112|113|114|115|116|117|118|119|120|121|122|123|124|125|126|127|128|129|130|131|132|133|134|135|136|137|138|139|140|141|142|143|144|145|146|147|148|149|150|151|152|153|154|155|156|157|158|159|160|161|162|163|164|165|166|167|168|169|170|171|172|173|174|175|176|177|178|179|180|181|182|183|184|185|186|187|188|189|190|191|192|193|194|195|196|197|198|199|200|201|202|203|204|205|206|207|208|209|210|211|212|213|214|215|216|217|218|219|220|221|222|223|224|225|226|227|228|229|230|231|232|233|234|235|236|237|238|239|240|241|242|243|244|245|246|247|248|249|250|251|252|253|254|255|256|257|258|259|260|261|262|263|264|265|266|267|268|269|270|271|272|273|274|275|276|277|278|279|280|281|282|283|284|285|286|287|288|289|290|291|292|293|294|295|296|297|298|299|300|
Frequency Statistics
|index|0|1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30|31|32|33|34|35|36|37|38|39|40|41|42|43|44|45|46|47|48|49|50|51|52|53|54|55|56|57|58|59|60|61|62|63|64|65|66|67|68|69|70|71|72|73|74|75|76|77|78|79|80|81|82|83|84|85|86|87|88|89|90|91|92|93|94|95|96|97|98|99|100|101|102|103|104|105|106|107|108|109|110|111|112|113|114|115|116|117|118|119|120|121|122|123|124|125|126|127|128|129|130|131|132|133|134|135|136|137|138|139|140|141|142|143|144|145|146|147|148|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
|Label|OGG\_POLITICS|CV\_POSITION|PS\_NAME|QT\_COUNT|LCP\_CITY|DT\_DAY|DT\_YEAR|LCP\_COUNTY|QT\_ORDER|DT\_OTHERS|TMM\_DISEASE|QT\_PRICE|QT\_MAN\_COUNT|DT\_DURATION|CV\_OCCUPATION|LC\_OTHERS|OGG\_ECONOMY|QT\_PERCENTAGE|OGG\_OTHERS|TMI\_PROJECT|LCP\_PROVINCE|AF\_TRANSPORT|OGG\_EDUCATION|LCP\_COUNTRY|EV\_OTHERS|AF\_BUILDING|CV\_LAW|TMI\_HW|OGG\_SPORTS|DT\_MONTH|CV\_RELATION|CV\_POLICY|CV\_FOOD|TI\_DURATION|TMI\_SERVICE|OGG\_MEDICINE|QT\_AGE|QT\_SIZE|AF\_ROAD|EV\_FESTIVAL|AM\_PART|EV\_SPORTS|CV\_PRIZE|TR\_SCIENCE|TM\_DIRECTION|OGG\_ART|QT\_OTHERS|PT\_GRASS|QT\_LENGTH|MT\_CHEMICAL|OGG\_SCIENCE|PT\_FRUIT|LCP\_CAPITALCITY|CV\_SPORTS|TMM\_DRUG|CV\_ART|LCG\_RIVER|AF\_CULTURAL\_ASSET|TM\_CELL\_TISSUE\_ORGAN|OGG\_RELIGION|QT\_SPORTS|QT\_WEIGHT|DT\_SEASON|AFA\_DOCUMENT|OGG\_MEDIA|TI\_OTHERS|TI\_HOUR|OGG\_MILITARY|LCG\_ISLAND|CV\_DRINK|LCG\_MOUNTAIN|CV\_TAX|CV\_FUNDS|TR\_MEDICINE|AFA\_VIDEO|AM\_MAMMALIA|OGG\_FOOD|MT\_ELEMENT|TM\_SPORTS|AM\_OTHERS|LCG\_CONTINENT|PT\_PART|OGG\_LAW|AFW\_OTHER\_PRODUCTS|CV\_CULTURE|AFW\_SERVICE\_PRODUCTS|CV\_CLOTHING|DT\_DYNASTY|FD\_MEDICINE|PT\_FLOWER|CV\_TRIBE|PT\_TREE|FD\_SCIENCE|TM\_COLOR|AM\_BIRD|QT\_ADDRESS|QT\_PHONE|CV\_LANGUAGE|TR\_SOCIAL\_SCIENCE|EV\_ACTIVITY|EV\_WAR\_REVOLUTION|CV\_SPORTS\_POSITION|OGG\_LIBRARY|AM\_TYPE|TMI\_SW|AFA\_MUSIC|DT\_WEEK|AFA\_PERFORMANCE|AFA\_ART\_CRAFT|FD\_HUMANITIES|QT\_VOLUME|TMI\_SITE|OGG\_HOTEL|LCG\_BAY|PS\_CHARACTER|LCG\_OCEAN|AM\_INSECT|AM\_FISH|QT\_TEMPERATURE|PT\_OTHERS|TM\_SHAPE|MT\_METAL|MT\_ROCK|AF\_MUSICAL\_INSTRUMENT|PT\_TYPE|QT\_SPEED|AF\_WEAPON|CV\_FOOD\_STYLE|LC\_SPACE|FD\_SOCIAL\_SCIENCE|CV\_SPORTS\_INST|TR\_ART|FD\_OTHERS|AM\_AMPHIBIA|AM\_REPTILIA|TMIG\_GENRE|TR\_OTHERS|TMI\_EMAIL|CV\_BUILDING\_TYPE|PS\_PET|TR\_HUMANITIES|DT\_GEOAGE|FD\_ART|CV\_CURRENCY|TMI\_MODEL|TI\_SECOND|QT\_CHANNEL|TM\_CLIMATE|TI\_MINUTE|
|Frequency|69683|43695|42060|30949|24791|19994|19836|19376|17908|17768|17622|15686|15460|15385|13634|13473|12744|12129|9912|9249|9084|8689|7475|7378|6144|5193|4875|4458|4440|4360|4002|3944|3537|3277|2993|2803|2659|2523|2465|2407|2401|2400|2231|2145|1999|1914|1911|1617|1615|1602|1589|1515|1395|1322|1307|1289|1258|1244|1165|1157|1145|1110|1097|987|980|979|976|967|937|884|869|859|857|855|837|775|752|720|715|689|688|683|667|631|583|505|467|453|445|441|437|410|395|391|391|388|383|370|367|367|362|337|304|296|285|283|275|273|265|245|240|229|222|220|220|204|192|191|188|158|151|149|148|130|126|124|113|110|107|82|52|43|42|41|40|37|35|34|30|25|22|19|19|11|8|8|5|3|2|
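Since `labels` are stored as integers, mapping them back to BIO tag strings requires the index-to-label table above. The sketch below is illustrative only: index 0 is `O` and index 1 is `B-PS_NAME` per that table, while index 151 is assumed to be `I-PS_NAME` because the `I-` variants follow the 150 `B-` labels; in practice the full 301-entry mapping would be built the same way.

```python
# Hypothetical partial mapping built by hand from the label table above.
id2label = {0: "O", 1: "B-PS_NAME", 151: "I-PS_NAME"}

def ids_to_tags(label_ids):
    # Convert a sequence of numeric labels into BIO tag strings;
    # unknown indices fall back to a generic placeholder.
    return [id2label.get(i, f"LABEL_{i}") for i in label_ids]

print(ids_to_tags([0, 1, 151, 0]))  # ['O', 'B-PS_NAME', 'I-PS_NAME', 'O']
```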
### Data Splits
The dataset, consisting of 150,142 sentences, has been split in a ratio of 8:2. There are 120,113 sentences in the training set and 30,029 sentences in the test set.
## Source Data
This dataset is based on the 'National Institute of Korean Language Named Entity Analysis Corpus 2022 (Version 1.1)' released by the National Institute of Korean Language in September 2023.
For more detailed information, please refer to the National Institute of Korean Language website > Resources > Research Materials > '2022 Corpus Named Entity Analysis and Entity Linking' project report.
### Citation
(국문) 국립국어원(2023). 국립국어원 개체명 분석 말뭉치 2022(버전 1.1) URL: https://corpus.korean.go.kr
(Eng) National Institute of Korean Language(2023). NIKL Named Entity Corpus 2022 (v.1.1) URL: https://corpus.korean.go.kr | yeajinmin/NER-News-BIDataset | [
"task_categories:token-classification",
"size_categories:100K<n<1M",
"language:ko",
"region:us"
] | 2023-12-24T15:20:10+00:00 | {"language": ["ko"], "size_categories": ["100K<n<1M"], "task_categories": ["token-classification"], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 76440290.15811698, "num_examples": 120113}, {"name": "test", "num_bytes": 19110549.84188302, "num_examples": 30029}], "download_size": 16997872, "dataset_size": 95550840}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-07T07:22:35+00:00 | [] | [
"ko"
] | TAGS
#task_categories-token-classification #size_categories-100K<n<1M #language-Korean #region-us
| Dataset Summary
---------------
NER-News-BIDataset is a dataset for named entity recognition (NER) in news articles, publicly released by the National Institute of Korean Language in 2023.
The dataset is labeled with named entities specifically for news data.
It consists of a total of 150,142 sentences, and entities are categorized into 150 labels for recognition.
Languages
---------
Korean
Data Structure
--------------
DatasetDict({
train: Dataset({
features: ['input\_ids', 'attention\_mask', 'labels'],
num\_rows: 120113
})
test: Dataset({
features: ['input\_ids', 'attention\_mask', 'labels'],
num\_rows: 30029
})
})
### Data Instances
The dataset is provided in text format with train/test sets.
Each instance represents a news article, and if there is an entity in the sentence, it is appropriately tagged with the corresponding label.
In cases where a single entity is separated into multiple tokens, the first token is labeled as "B-entity" and the subsequent tokens are labeled as "I-entity" until the end.
### Data Fields
input\_ids: "A processed named entity corpus of news articles constructed in 2022" has been tokenized and represented with numerical values.
label: Identified a total of 151 entities, including the 0th label (not an entity). If counting both "B-entity" and "I-entity" labels for each entity, there are a total of 301 labels.
The labeling is done with numerical values.
The 151 types of labels are as follows:
Frequency Statistics
### Data Splits
The dataset, consisting of 150,142 sentences, has been split in a ratio of 8:2. There are 120,113 sentences in the training set and 30,029 sentences in the test set.
Source Data
-----------
This dataset is based on the 'National Institute of Korean Language Named Entity Analysis Corpus 2022 (Version 1.1)' released by the National Institute of Korean Language in September 2023.
For more detailed information, please refer to the National Institute of Korean Language website > Resources > Research Materials > '2022 Corpus Named Entity Analysis and Entity Linking' project report.
(국문) 국립국어원(2023). 국립국어원 개체명 분석 말뭉치 2022(버전 1.1) URL: URL
(Eng) National Institute of Korean Language(2023). NIKL Named Entity Corpus 2022 (v.1.1) URL: URL
| [
"### Data Instances\n\n\nThe dataset is provided in text format with train/test sets. \n\nEach instance represents a news article, and if there is an entity in the sentence, it is appropriately tagged with the corresponding label. \n\nIn cases where a single entity is separated into multiple tokens, the first token is labeled as \"B-entity\" and the subsequent tokens are labeled as \"I-entity\" until the end.",
"### Data Fields\n\n\ninput\\_ids: \"A processed named entity corpus of news articles constructed in 2022\" has been tokenized and represented with numerical values. \n\nlabel: Identified a total of 151 entities, including the 0th label (not an entity). If counting both \"B-entity\" and \"I-entity\" labels for each entity, there are a total of 301 labels.\nThe labeling is done with numerical values. \n\nThe 151 types of labels are as follows:\n\n\n\nFrequency Statistics",
"### Data Splits\n\n\nThe dataset, consisting of 150,142 sentences, has been split in a ratio of 8:2. There are 120,113 sentences in the training set and 3,029 sentences in the test set.\n\n\nSource Data\n-----------\n\n\nThis dataset is based on the 'National Institute of Korean Language Named Entity Analysis Corpus 2022 (Version 1.1)' released by the National Institute of Korean Language in September 2023. \n\nFor more detailed information, please refer to the National Institute of Korean Language website > Resources > Research Materials > '2022 Corpus Named Entity Analysis and Entity Linking' project report.\n\n\n(국문) 국립국어원(2023). 국립국어원 개체명 분석 말뭉치 2022(버전 1.1) URL: URL \n\n(Eng) National Institute of Korean Language(2023). NIKL Named Entity Corpus 2022 (v.1.1) URL: URL"
] | [
"TAGS\n#task_categories-token-classification #size_categories-100K<n<1M #language-Korean #region-us \n",
"### Data Instances\n\n\nThe dataset is provided in text format with train/test sets. \n\nEach instance represents a news article, and if there is an entity in the sentence, it is appropriately tagged with the corresponding label. \n\nIn cases where a single entity is separated into multiple tokens, the first token is labeled as \"B-entity\" and the subsequent tokens are labeled as \"I-entity\" until the end.",
"### Data Fields\n\n\ninput\\_ids: \"A processed named entity corpus of news articles constructed in 2022\" has been tokenized and represented with numerical values. \n\nlabel: Identified a total of 151 entities, including the 0th label (not an entity). If counting both \"B-entity\" and \"I-entity\" labels for each entity, there are a total of 301 labels.\nThe labeling is done with numerical values. \n\nThe 151 types of labels are as follows:\n\n\n\nFrequency Statistics",
"### Data Splits\n\n\nThe dataset, consisting of 150,142 sentences, has been split in a ratio of 8:2. There are 120,113 sentences in the training set and 3,029 sentences in the test set.\n\n\nSource Data\n-----------\n\n\nThis dataset is based on the 'National Institute of Korean Language Named Entity Analysis Corpus 2022 (Version 1.1)' released by the National Institute of Korean Language in September 2023. \n\nFor more detailed information, please refer to the National Institute of Korean Language website > Resources > Research Materials > '2022 Corpus Named Entity Analysis and Entity Linking' project report.\n\n\n(국문) 국립국어원(2023). 국립국어원 개체명 분석 말뭉치 2022(버전 1.1) URL: URL \n\n(Eng) National Institute of Korean Language(2023). NIKL Named Entity Corpus 2022 (v.1.1) URL: URL"
] | [
35,
99,
121,
190
] | [
"passage: TAGS\n#task_categories-token-classification #size_categories-100K<n<1M #language-Korean #region-us \n### Data Instances\n\n\nThe dataset is provided in text format with train/test sets. \n\nEach instance represents a news article, and if there is an entity in the sentence, it is appropriately tagged with the corresponding label. \n\nIn cases where a single entity is separated into multiple tokens, the first token is labeled as \"B-entity\" and the subsequent tokens are labeled as \"I-entity\" until the end.### Data Fields\n\n\ninput\\_ids: \"A processed named entity corpus of news articles constructed in 2022\" has been tokenized and represented with numerical values. \n\nlabel: Identified a total of 151 entities, including the 0th label (not an entity). If counting both \"B-entity\" and \"I-entity\" labels for each entity, there are a total of 301 labels.\nThe labeling is done with numerical values. \n\nThe 151 types of labels are as follows:\n\n\n\nFrequency Statistics### Data Splits\n\n\nThe dataset, consisting of 150,142 sentences, has been split in a ratio of 8:2. There are 120,113 sentences in the training set and 3,029 sentences in the test set.\n\n\nSource Data\n-----------\n\n\nThis dataset is based on the 'National Institute of Korean Language Named Entity Analysis Corpus 2022 (Version 1.1)' released by the National Institute of Korean Language in September 2023. \n\nFor more detailed information, please refer to the National Institute of Korean Language website > Resources > Research Materials > '2022 Corpus Named Entity Analysis and Entity Linking' project report.\n\n\n(국문) 국립국어원(2023). 국립국어원 개체명 분석 말뭉치 2022(버전 1.1) URL: URL \n\n(Eng) National Institute of Korean Language(2023). NIKL Named Entity Corpus 2022 (v.1.1) URL: URL"
] |
21aa0ef1a01cd7e866648351b87f7eabb97fe553 |
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_glaive_assistant_v0.1](https://huggingface.co/mwitiderrick/open_llama_3b_glaive_assistant_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-24T15:30:58.556740](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1/blob/main/results_2023-12-24T15-30-58.556740.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2843747535406573,
"acc_stderr": 0.031689110133124844,
"acc_norm": 0.28633888958645765,
"acc_norm_stderr": 0.03246963675970039,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3585983664640556,
"mc2_stderr": 0.013742745779138914
},
"harness|arc:challenge|25": {
"acc": 0.3703071672354949,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009131
},
"harness|hellaswag|10": {
"acc": 0.4971121290579566,
"acc_stderr": 0.004989698183207831,
"acc_norm": 0.6744672376020713,
"acc_norm_stderr": 0.004676159299105414
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337156,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337156
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113932,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113932
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124252,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124252
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.034531318018854146,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.034531318018854146
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.03304205087813653,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.03304205087813653
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360383,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360383
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.019109299846098275,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.019109299846098275
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.041733491480834974,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.041733491480834974
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02905858830374884,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02905858830374884
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770957,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.01094657096634878,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.01094657096634878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3585983664640556,
"mc2_stderr": 0.013742745779138914
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788961
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.003828982978735702
}
}
```
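If you want to recompute a headline number from the per-task entries above (for example, a mean accuracy over the MMLU `hendrycksTest` subtasks), a small sketch like the following works on the JSON shown here. The local file name is an assumption; in practice the same values are available through the `results` configuration.

```python
import json

# Assumes the JSON block above has been saved locally as "results.json";
# the same values can be read from the "results" configuration instead.
with open("results.json") as f:
    metrics = json.load(f)

# Average the 5-shot accuracy over every MMLU (hendrycksTest) subtask.
mmlu_accs = [
    v["acc"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```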
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1 | [
"region:us"
] | 2023-12-24T15:32:41+00:00 | {"pretty_name": "Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_glaive_assistant_v0.1](https://huggingface.co/mwitiderrick/open_llama_3b_glaive_assistant_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T15:30:58.556740](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1/blob/main/results_2023-12-24T15-30-58.556740.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2843747535406573,\n \"acc_stderr\": 0.031689110133124844,\n \"acc_norm\": 0.28633888958645765,\n \"acc_norm_stderr\": 0.03246963675970039,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3585983664640556,\n \"mc2_stderr\": 0.013742745779138914\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009131\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4971121290579566,\n \"acc_stderr\": 0.004989698183207831,\n \"acc_norm\": 0.6744672376020713,\n \"acc_norm_stderr\": 0.004676159299105414\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337156,\n \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337156\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 
0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113932,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113932\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.03619604524124252,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.03619604524124252\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.034531318018854146,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.034531318018854146\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.03304205087813653,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.03304205087813653\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n 
\"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27339449541284405,\n \"acc_stderr\": 0.019109299846098275,\n \"acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.019109299846098275\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2975206611570248,\n \"acc_stderr\": 0.041733491480834974,\n \"acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.041733491480834974\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.02905858830374884,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.02905858830374884\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n 
\"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n \"acc_stderr\": 0.016225017944770957,\n \"acc_norm\": 0.28991060025542786,\n \"acc_norm_stderr\": 0.016225017944770957\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.3183279742765273,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n \"acc_stderr\": 0.01094657096634878,\n \"acc_norm\": 0.242503259452412,\n \"acc_norm_stderr\": 0.01094657096634878\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596455,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596455\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427657,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3585983664640556,\n \"mc2_stderr\": 0.013742745779138914\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788961\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.003828982978735702\n }\n}\n```", "repo_url": "https://huggingface.co/mwitiderrick/open_llama_3b_glaive_assistant_v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-30-58.556740.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-30-58.556740.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-30-58.556740.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-30-58.556740.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-30-58.556740.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["**/details_harness|winogrande|5_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-24T15-30-58.556740.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T15_30_58.556740", "path": ["results_2023-12-24T15-30-58.556740.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T15-30-58.556740.parquet"]}]}]} | 2023-12-24T15:33:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1
Dataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_assistant_v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
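The original code snippet was not preserved in this copy of the card; the following is a minimal sketch that assumes the standard leaderboard naming convention for the details repository and the `harness_winogrande_5` configuration:

```python
from datasets import load_dataset

# Repo id and config name follow the usual leaderboard convention; treat them as assumptions here.
data = load_dataset(
    "open-llm-leaderboard/details_mwitiderrick__open_llama_3b_glaive_assistant_v0.1",
    "harness_winogrande_5",
    split="train",
)
```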
## Latest results
These are the latest results from run 2023-12-24T15:30:58.556740 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_assistant_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:30:58.556740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_assistant_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:30:58.556740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_glaive_assistant_v0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_glaive_assistant_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T15:30:58.556740(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
b7802366c67efa3535db0df7b30c045814dd3b4c |
# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE](https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE",
"harness_winogrande_5",
split="train")
```
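The per-task configurations listed in this card can be loaded the same way by swapping the configuration name. The aggregated metrics are also stored under the `results` configuration; a sketch, assuming the conventional `latest` split name:

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE",
    "results",
    split="latest",
)
```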
## Latest results
These are the [latest results from run 2023-12-24T15:33:14.628104](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE/blob/main/results_2023-12-24T15-33-14.628104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6073435568644537,
"acc_stderr": 0.03313530519533436,
"acc_norm": 0.6118855098653408,
"acc_norm_stderr": 0.03380762825921495,
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6818136388417556,
"mc2_stderr": 0.015193094432096838
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522084,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6679944234216292,
"acc_stderr": 0.004699705280976588,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.003574776594108505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348406,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457138,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890172,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890172
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455333,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6818136388417556,
"mc2_stderr": 0.015193094432096838
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902547
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.013460852357095656
}
}
```
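As a quick sanity check, the per-task entries above can be aggregated directly; a minimal sketch, assuming the JSON block has been saved locally as `results.json` (hypothetical path), computing the mean accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# Hypothetical local copy of the JSON block shown above.
with open("results.json") as f:
    results = json.load(f)

# Collect per-subtask accuracies for the MMLU ("hendrycksTest") tasks.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```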
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE | [
"region:us"
] | 2023-12-24T15:35:31+00:00 | {"pretty_name": "Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE](https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T15:33:14.628104](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE/blob/main/results_2023-12-24T15-33-14.628104.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6073435568644537,\n \"acc_stderr\": 0.03313530519533436,\n \"acc_norm\": 0.6118855098653408,\n \"acc_norm_stderr\": 0.03380762825921495,\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6818136388417556,\n \"mc2_stderr\": 0.015193094432096838\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6679944234216292,\n \"acc_stderr\": 0.004699705280976588,\n \"acc_norm\": 0.8488348934475204,\n \"acc_norm_stderr\": 0.003574776594108505\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n \"acc_norm\": 0.8549222797927462,\n 
\"acc_norm_stderr\": 0.025416343096306443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.015609929559348406,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.015609929559348406\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457138,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890172,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890172\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455333,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6818136388417556,\n \"mc2_stderr\": 0.015193094432096838\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902547\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.39423805913570886,\n \"acc_stderr\": 0.013460852357095656\n }\n}\n```", "repo_url": "https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T15_33_14.628104", "path": ["**/details_harness|winogrande|5_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T15-33-14.628104.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_24T15_33_14.628104", "path": ["results_2023-12-24T15-33-14.628104.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T15-33-14.628104.parquet"]}]}]} | 2023-12-24T15:35:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE
Dataset automatically created during the evaluation run of model perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
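A minimal example, assuming the repository id follows the same details_<org>__<model> naming convention used by the other evaluation-run datasets in this collection:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE",
	"harness_winogrande_5",
	split="train")
```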
## Latest results
These are the latest results from run 2023-12-24T15:33:14.628104 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:33:14.628104(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T15:33:14.628104(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
205,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T15:33:14.628104(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
872827c59eb564f16c1a282695db211dd44a2fc0 |
# Hinglish Instruct Dataset using Self Instruct method
The prompt used for generating the samples:
```
You are asked to come up with a set of 50 diverse task instructions in Hinglish or Hindi.
These task instructions will be given to a GPT model and we will evaluate the GPT model for completing the instructions.
Here are the requirements:
1. Try not to repeat the verb for each instruction to maximize diversity.
2. The language used for the instruction also should be diverse. For example, you should combine questions with imperative instructions.
3. The type of instructions should be diverse. The list should include diverse types of tasks like open-ended generation, classification, editing, etc.
2. A GPT language model should be able to complete the instruction. For example, do not ask the assistant to create any visual or audio output. For another example, do not ask the assistant to wake you up at 5pm or set a reminder because it cannot perform any action.
3. The instructions should be in Hinglish or Hindi.
4. The instructions should be 1 to 2 sentences long. Either an imperative sentence or a question is permitted.
5. You should generate an appropriate input to the instruction. The input field should contain a specific example provided for the instruction. It should involve realistic data and should not contain simple placeholders. The input should provide substantial content to make the instruction challenging but should ideally not exceed 100 words.
6. Not all instructions require input. For example, when an instruction asks about some general information, "what is the highest peak in the world", it is not necessary to provide a specific context. In this case, we simply put "<noinput>" in the input field.
7. The output should be an appropriate response to the instruction and the input. Make sure the output is less than 500 words.
8. Provide complete instruction, input and output without placeholders.
9. Example format of the output task:
Instruction: किसी भी एक भारतीय त्योहार के बारे में बताएं।
Input: <noinput>
Output: दीपावली, जिसे दिवाली भी कहा जाता है, प्रकाश का त्योहार है। इस दिन, लोग अपने घरों में दीये जलाते हैं, पूजा करते हैं, और मिठाईयाँ बांटते हैं।
List of 50 tasks:
```
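As a rough sketch of how this prompt can be turned into samples, the call below sends it to the OpenAI chat completions API with the modern `openai` Python client. This is an illustration only, not the exact script used to build the dataset (which is not included in this card); the model name, temperature, and parsing step are assumptions:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# `prompt` holds the full instruction text shown above.
prompt = "You are asked to come up with a set of 50 diverse task instructions in Hinglish or Hindi. ..."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # part of the dataset was generated with GPT-4 instead
    messages=[{"role": "user", "content": prompt}],
    temperature=1.0,
)

# The reply is a block of Instruction/Input/Output triplets that still has to be
# split into individual samples before they are added to the dataset.
raw_tasks = response.choices[0].message.content
print(raw_tasks[:500])
```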
Note:
1. The instruction "Provide complete instruction, input and output without placeholders." was important else GPT-4 especially was **very lazy** and just gave placeholders for the outputs.
2. Most of the dataset was generated using GPT-3.5 Turbo, while some of it was generated using GPT-4. Most of the dataset is in Hinglish, while some of it is in Hindi.
3. The prompt template is adapted from the Alpaca GitHub repo https://github.com/tatsu-lab/stanford_alpaca/blob/main/prompt.txt | smangrul/hinglish_self_instruct_v0 | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:hi",
"language:en",
"region:us"
] | 2023-12-24T16:02:39+00:00 | {"language": ["hi", "en"], "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 251497, "num_examples": 1018}], "download_size": 124371, "dataset_size": 251497}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-24T16:26:45+00:00 | [] | [
"hi",
"en"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Hindi #language-English #region-us
|
# Hinglish Instruct Dataset using Self Instruct method
The prompt used for generating the samples:
Note:
1. The instruction "Provide complete instruction, input and output without placeholders." was important else GPT-4 especially was very lazy and just gave placeholders for the outputs.
2. Most of the dataset is generated using GPT-3.5 Turbo while some part of it is generated using GPT-4. Most of the dataset is in Hinglish while some part of it is in Hindi.
3. The prompt template is adapted from the Alpaca GitHub repo URL | [
"# Hinglish Instruct Dataset using Self Instruct method\n\nThe prompt used for generating the samples:\n\n\n\nNote: \n1. The instruction \"Provide complete instruction, input and output without placeholders.\" was important else GPT-4 especially was very lazy and just gave placeholders for the outputs.\n2. Most of the dataset is generated using GPT-3.5 Turbo while some part of it is generated using GPT-4. Most of the dataset is in Hinglish while some part of it is in Hindi.\n3. The prompt template is adapted from the Alpaca GitHub repo URL"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Hindi #language-English #region-us \n",
"# Hinglish Instruct Dataset using Self Instruct method\n\nThe prompt used for generating the samples:\n\n\n\nNote: \n1. The instruction \"Provide complete instruction, input and output without placeholders.\" was important else GPT-4 especially was very lazy and just gave placeholders for the outputs.\n2. Most of the dataset is generated using GPT-3.5 Turbo while some part of it is generated using GPT-4. Most of the dataset is in Hinglish while some part of it is in Hindi.\n3. The prompt template is adapted from the Alpaca GitHub repo URL"
] | [
37,
129
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Hindi #language-English #region-us \n# Hinglish Instruct Dataset using Self Instruct method\n\nThe prompt used for generating the samples:\n\n\n\nNote: \n1. The instruction \"Provide complete instruction, input and output without placeholders.\" was important else GPT-4 especially was very lazy and just gave placeholders for the outputs.\n2. Most of the dataset is generated using GPT-3.5 Turbo while some part of it is generated using GPT-4. Most of the dataset is in Hinglish while some part of it is in Hindi.\n3. The prompt template is adapted from the Alpaca GitHub repo URL"
] |
e01055a25f1e641b4c1a428b3d213e641b9974a6 | # Dataset Card for "summarize_from_feedback_oai_preprocessing_pythia-160m_53"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/summarize_from_feedback_oai_preprocessing_pythia-160m_53 | [
"region:us"
] | 2023-12-24T16:09:05+00:00 | {"dataset_info": {"features": [{"name": "info", "struct": [{"name": "id", "dtype": "string"}, {"name": "post", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "subreddit", "dtype": "string"}, {"name": "site", "dtype": "string"}, {"name": "article", "dtype": "string"}]}, {"name": "summaries", "list": [{"name": "text", "dtype": "string"}, {"name": "policy", "dtype": "string"}, {"name": "note", "dtype": "string"}]}, {"name": "choice", "dtype": "int32"}, {"name": "worker", "dtype": "string"}, {"name": "batch", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "extra", "struct": [{"name": "confidence", "dtype": "int32"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query", "dtype": "string"}, {"name": "response0", "dtype": "string"}, {"name": "response0_token", "sequence": "int64"}, {"name": "response0_token_len", "dtype": "int64"}, {"name": "response1", "dtype": "string"}, {"name": "response1_token", "sequence": "int64"}, {"name": "response1_token_len", "dtype": "int64"}, {"name": "response0_policy", "dtype": "string"}, {"name": "response1_policy", "dtype": "string"}, {"name": "policies", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 797913299, "num_examples": 92858}, {"name": "validation", "num_bytes": 750011499, "num_examples": 86086}], "download_size": 127088280, "dataset_size": 1547924798}} | 2023-12-24T16:09:34+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summarize_from_feedback_oai_preprocessing_pythia-160m_53"
More Information needed | [
"# Dataset Card for \"summarize_from_feedback_oai_preprocessing_pythia-160m_53\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summarize_from_feedback_oai_preprocessing_pythia-160m_53\"\n\nMore Information needed"
] | [
6,
33
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summarize_from_feedback_oai_preprocessing_pythia-160m_53\"\n\nMore Information needed"
] |
af4b8ca2d145cb27959d0e83b77daa38b57368e9 |
# Dataset Card for Evaluation run of Mihaiii/Metis-0.3-merged
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.3-merged](https://huggingface.co/Mihaiii/Metis-0.3-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged",
"harness_winogrande_5",
split="train")
```
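The aggregated metrics are stored in the "results" configuration; assuming the split naming used in this repository's metadata (where "latest" points at the most recent run), they can be loaded the same way:

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged",
	"results",
	split="latest")
```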
## Latest results
These are the [latest results from run 2023-12-24T16:37:24.768946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged/blob/main/results_2023-12-24T16-37-24.768946.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6222040509450919,
"acc_stderr": 0.03268902421558277,
"acc_norm": 0.630054662201999,
"acc_norm_stderr": 0.0333854076462143,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5923566084998495,
"mc2_stderr": 0.015555842162231328
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.6547500497908784,
"acc_stderr": 0.004744780201276635,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630793,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630793
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296422,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296422
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417465,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332366,
"mc2": 0.5923566084998495,
"mc2_stderr": 0.015555842162231328
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773211
},
"harness|gsm8k|5": {
"acc": 0.21834723275208492,
"acc_stderr": 0.011379497266738047
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged | [
"region:us"
] | 2023-12-24T16:39:41+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Metis-0.3-merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.3-merged](https://huggingface.co/Mihaiii/Metis-0.3-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T16:37:24.768946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged/blob/main/results_2023-12-24T16-37-24.768946.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6222040509450919,\n \"acc_stderr\": 0.03268902421558277,\n \"acc_norm\": 0.630054662201999,\n \"acc_norm_stderr\": 0.0333854076462143,\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5923566084998495,\n \"mc2_stderr\": 0.015555842162231328\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6547500497908784,\n \"acc_stderr\": 0.004744780201276635,\n \"acc_norm\": 0.8399721171081458,\n \"acc_norm_stderr\": 0.0036588262081016063\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n 
\"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n 
\"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630793,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630793\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521269,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521269\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n \"acc_stderr\": 0.014283378044296422,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 
0.014283378044296422\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417465,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332366,\n \"mc2\": 0.5923566084998495,\n \"mc2_stderr\": 0.015555842162231328\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773211\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21834723275208492,\n \"acc_stderr\": 0.011379497266738047\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Metis-0.3-merged", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|arc:challenge|25_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|gsm8k|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hellaswag|10_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T16-37-24.768946.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T16-37-24.768946.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T16-37-24.768946.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T16-37-24.768946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T16-37-24.768946.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T16-37-24.768946.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["**/details_harness|winogrande|5_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T16-37-24.768946.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T16_37_24.768946", "path": ["results_2023-12-24T16-37-24.768946.parquet"]}, {"split": "latest", "path": 
["results_2023-12-24T16-37-24.768946.parquet"]}]}]} | 2023-12-24T16:40:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Metis-0.3-merged
Dataset automatically created during the evaluation run of model Mihaiii/Metis-0.3-merged on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
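```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.3-merged",
	"harness_winogrande_5",
	split="train")
```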
## Latest results
These are the latest results from run 2023-12-24T16:37:24.768946 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
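The aggregate scores for this run are:

```python
{
    "all": {
        "acc": 0.6222040509450919,
        "acc_stderr": 0.03268902421558277,
        "acc_norm": 0.630054662201999,
        "acc_norm_stderr": 0.0333854076462143,
        "mc1": 0.43084455324357407,
        "mc1_stderr": 0.017335272475332366,
        "mc2": 0.5923566084998495,
        "mc2_stderr": 0.015555842162231328
    }
}
```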
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.3-merged\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.3-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T16:37:24.768946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.3-merged\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.3-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T16:37:24.768946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Metis-0.3-merged\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.3-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T16:37:24.768946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d017b35ec2339179bc24e8e0aa04bd8cf3525e82 | first 10 halo books, then some fanfics | Superintendent/halo-lore | [
"region:us"
] | 2023-12-24T17:11:17+00:00 | {} | 2023-12-30T00:21:43+00:00 | [] | [] | TAGS
#region-us
| first 10 halo books, then some fanfics | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
393698dd75054be5a78494969f08f81dc27c0ba5 |
# Dataset Card for Evaluation run of fblgit/una-llama-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fblgit/una-llama-7b](https://huggingface.co/fblgit/una-llama-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__una-llama-7b",
"harness_winogrande_5",
split="train")
```
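
Once loaded, the split behaves like any other `datasets.Dataset`; a minimal sketch for inspecting it (the exact columns vary by task and harness version):

```python
# Quick look at what the split contains: size, columns, and one example record
print(data)               # Dataset(...) with num_rows and features
print(data.column_names)  # per-sample detail columns logged by the harness
print(data[0])            # first evaluated example as a plain dict
```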
## Latest results
These are the [latest results from run 2023-12-24T17:39:22.935807](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-llama-7b/blob/main/results_2023-12-24T17-39-22.935807.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.380732359519661,
"acc_stderr": 0.034121438696938955,
"acc_norm": 0.3837076059961357,
"acc_norm_stderr": 0.03491178592677983,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.38012253018489384,
"mc2_stderr": 0.014122907654663121
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244084,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.014572000527756986
},
"harness|hellaswag|10": {
"acc": 0.599681338378809,
"acc_stderr": 0.004889615413144194,
"acc_norm": 0.8007369049990042,
"acc_norm_stderr": 0.003986299037840092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730585,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730585
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.02757596072327823,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.02757596072327823
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.03194740072265541,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.03194740072265541
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.48704663212435234,
"acc_stderr": 0.03607228061047749,
"acc_norm": 0.48704663212435234,
"acc_norm_stderr": 0.03607228061047749
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02443301646605245,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02443301646605245
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097856,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097856
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48990825688073397,
"acc_stderr": 0.021432956203453327,
"acc_norm": 0.48990825688073397,
"acc_norm_stderr": 0.021432956203453327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380758,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380758
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.46835443037974683,
"acc_stderr": 0.03248197400511074,
"acc_norm": 0.46835443037974683,
"acc_norm_stderr": 0.03248197400511074
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.047323326159788154,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.047323326159788154
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.03265903381186194,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.03265903381186194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.01778403453499245,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.01778403453499245
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.02801365189199507,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.02801365189199507
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30638852672750977,
"acc_stderr": 0.011773980329380719,
"acc_norm": 0.30638852672750977,
"acc_norm_stderr": 0.011773980329380719
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0301619119307671,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0301619119307671
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3415032679738562,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.3415032679738562,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3795918367346939,
"acc_stderr": 0.031067211262872485,
"acc_norm": 0.3795918367346939,
"acc_norm_stderr": 0.031067211262872485
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.03820042586602966,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.03820042586602966
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.38012253018489384,
"mc2_stderr": 0.014122907654663121
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626304
},
"harness|gsm8k|5": {
"acc": 0.0978013646702047,
"acc_stderr": 0.008182119821849038
}
}
```
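As a quick illustration of how these per-task blocks can be consumed, the sketch below assumes the JSON document above has been parsed into a Python dict named `results`; it averages the 5-shot normalized accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
def mmlu_average(results: dict) -> float:
    """Mean 5-shot acc_norm over the hendrycksTest (MMLU) subtasks
    of a results document shaped like the JSON shown above."""
    mmlu = {k: v for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")}
    return sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
```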
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
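A concrete description of the fields is still to be added; as a starting point, the layout can be inspected directly. A minimal sketch (the configuration name `harness_arc_challenge_25` is one of the per-task configurations listed in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_fblgit__una-llama-7b"

# One configuration per evaluated task, plus the aggregated "results" configuration
print(get_dataset_config_names(repo))

# Every configuration exposes a timestamped split and a "latest" split
details = load_dataset(repo, "harness_arc_challenge_25", split="latest")
print(details.column_names)
```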
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fblgit__una-llama-7b | [
"region:us"
] | 2023-12-24T17:41:08+00:00 | {"pretty_name": "Evaluation run of fblgit/una-llama-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [fblgit/una-llama-7b](https://huggingface.co/fblgit/una-llama-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-llama-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T17:39:22.935807](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-llama-7b/blob/main/results_2023-12-24T17-39-22.935807.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.380732359519661,\n \"acc_stderr\": 0.034121438696938955,\n \"acc_norm\": 0.3837076059961357,\n \"acc_norm_stderr\": 0.03491178592677983,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.38012253018489384,\n \"mc2_stderr\": 0.014122907654663121\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244084,\n \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756986\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.599681338378809,\n \"acc_stderr\": 0.004889615413144194,\n \"acc_norm\": 0.8007369049990042,\n \"acc_norm_stderr\": 0.003986299037840092\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n 
\"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730585,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730585\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3774193548387097,\n \"acc_stderr\": 0.02757596072327823,\n \"acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.02757596072327823\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.03194740072265541,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.03194740072265541\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056128,\n \"acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056128\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.48704663212435234,\n \"acc_stderr\": 0.03607228061047749,\n \"acc_norm\": 0.48704663212435234,\n \"acc_norm_stderr\": 0.03607228061047749\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36666666666666664,\n 
\"acc_stderr\": 0.02443301646605245,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02443301646605245\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097856,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097856\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.48990825688073397,\n \"acc_stderr\": 0.021432956203453327,\n \"acc_norm\": 0.48990825688073397,\n \"acc_norm_stderr\": 0.021432956203453327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380758,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380758\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.46835443037974683,\n \"acc_stderr\": 0.03248197400511074,\n \"acc_norm\": 0.46835443037974683,\n \"acc_norm_stderr\": 0.03248197400511074\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.047323326159788154,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.047323326159788154\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.04825729337356389,\n \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.04825729337356389\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.03265903381186194,\n \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.03265903381186194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.01778403453499245,\n \"acc_norm\": 
0.4482758620689655,\n \"acc_norm_stderr\": 0.01778403453499245\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.026483392042098177,\n \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.026483392042098177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.0282135041778241,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.0282135041778241\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.02801365189199507,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.02801365189199507\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30638852672750977,\n \"acc_stderr\": 0.011773980329380719,\n \"acc_norm\": 0.30638852672750977,\n \"acc_norm_stderr\": 0.011773980329380719\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0301619119307671,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0301619119307671\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3415032679738562,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.3415032679738562,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n \"acc_stderr\": 0.04631381319425463,\n \"acc_norm\": 0.37272727272727274,\n \"acc_norm_stderr\": 0.04631381319425463\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3795918367346939,\n \"acc_stderr\": 0.031067211262872485,\n \"acc_norm\": 0.3795918367346939,\n \"acc_norm_stderr\": 0.031067211262872485\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.03820042586602966,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.03820042586602966\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.38012253018489384,\n \"mc2_stderr\": 0.014122907654663121\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626304\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0978013646702047,\n \"acc_stderr\": 0.008182119821849038\n }\n}\n```", "repo_url": "https://huggingface.co/fblgit/una-llama-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-39-22.935807.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-39-22.935807.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-39-22.935807.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T17-39-22.935807.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-39-22.935807.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T17-39-22.935807.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["**/details_harness|winogrande|5_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T17-39-22.935807.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T17_39_22.935807", "path": ["results_2023-12-24T17-39-22.935807.parquet"]}, {"split": "latest", "path": 
["results_2023-12-24T17-39-22.935807.parquet"]}]}]} | 2023-12-24T17:41:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fblgit/una-llama-7b
Dataset automatically created during the evaluation run of model fblgit/una-llama-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
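```python
from datasets import load_dataset

# Any other per-task configuration (e.g. "harness_arc_challenge_25")
# can be loaded the same way.
data = load_dataset("open-llm-leaderboard/details_fblgit__una-llama-7b",
	"harness_winogrande_5",
	split="train")
```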
## Latest results
These are the latest results from run 2023-12-24T17:39:22.935807 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
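A minimal sketch for pulling these aggregated numbers programmatically (assuming the "results" configuration and its "latest" split described above):

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run (2023-12-24T17:39:22.935807)
results = load_dataset(
    "open-llm-leaderboard/details_fblgit__una-llama-7b",
    "results",
    split="latest",
)
print(results[0])
```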
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fblgit/una-llama-7b\n\n\n\nDataset automatically created during the evaluation run of model fblgit/una-llama-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:39:22.935807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fblgit/una-llama-7b\n\n\n\nDataset automatically created during the evaluation run of model fblgit/una-llama-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T17:39:22.935807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fblgit/una-llama-7b\n\n\n\nDataset automatically created during the evaluation run of model fblgit/una-llama-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T17:39:22.935807(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
98d998edc81bf182e315d0424134af2e7ef06e48 |
# Nusa-MT
Dataset Collection for Indonesian Machine Translation. The dataset comes from the following sources:
- ELRC_2922
- GlobalVoices
- News-Commentary
- Tatoeba
- Tico-19
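A minimal usage sketch; only the repository id `cahya/nusa-mt` comes from this page, while the configuration, split, and column layout are assumptions (a `translation` column with `en`/`id` keys is common for MT corpora but is not documented here):

```python
from datasets import load_dataset

# Assumed: the default configuration exists and provides a "train" split.
ds = load_dataset("cahya/nusa-mt", split="train")
print(ds[0])  # inspect one English-Indonesian pair
```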
| cahya/nusa-mt | [
"task_categories:translation",
"task_categories:text-generation",
"annotations_creators:no-annotation",
"language_creators:crowdsourced",
"multilinguality:translation",
"size_categories:unknown",
"source_datasets:original",
"language:en",
"language:id",
"license:cc-by-2.0",
"region:us"
] | 2023-12-24T17:53:32+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["crowdsourced"], "language": ["en", "id"], "license": ["cc-by-2.0"], "multilinguality": ["translation"], "size_categories": ["unknown"], "source_datasets": ["original"], "task_categories": ["translation", "text-generation"], "pretty_name": "Dataset Collection for Indonesian Machine Translation"} | 2023-12-24T18:52:41+00:00 | [] | [
"en",
"id"
] | TAGS
#task_categories-translation #task_categories-text-generation #annotations_creators-no-annotation #language_creators-crowdsourced #multilinguality-translation #size_categories-unknown #source_datasets-original #language-English #language-Indonesian #license-cc-by-2.0 #region-us
|
# Nusa-MT
Dataset Collection for Indonesian Machine Translation. The dataset comes from the following sources:
- ELRC_2922
- GlobalVoices
- News-Commentary
- Tatoeba
- Tico-19
| [
"# Nusa-MT\nDataset Collection for Indonesian Machine Translation. The dataset come from following sources:\n- ELRC_2922\n- GlobalVoices\n- News-Commentary\n- Tatoeba\n- Tico-19"
] | [
"TAGS\n#task_categories-translation #task_categories-text-generation #annotations_creators-no-annotation #language_creators-crowdsourced #multilinguality-translation #size_categories-unknown #source_datasets-original #language-English #language-Indonesian #license-cc-by-2.0 #region-us \n",
"# Nusa-MT\nDataset Collection for Indonesian Machine Translation. The dataset come from following sources:\n- ELRC_2922\n- GlobalVoices\n- News-Commentary\n- Tatoeba\n- Tico-19"
] | [
92,
47
] | [
"passage: TAGS\n#task_categories-translation #task_categories-text-generation #annotations_creators-no-annotation #language_creators-crowdsourced #multilinguality-translation #size_categories-unknown #source_datasets-original #language-English #language-Indonesian #license-cc-by-2.0 #region-us \n# Nusa-MT\nDataset Collection for Indonesian Machine Translation. The dataset come from following sources:\n- ELRC_2922\n- GlobalVoices\n- News-Commentary\n- Tatoeba\n- Tico-19"
] |
4003e1625b68aee616728385d81b4f993b41cc5f |
# Dataset Card for Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K](https://huggingface.co/Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K",
"harness_winogrande_5",
split="train")
```
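To enumerate the available task configurations before picking one, and to pull the aggregated scores, something like the following sketch can be used (the `results` configuration comes from the description above; the `latest` split name is assumed from the file layout in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K"

# All 63 task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Aggregated scores of the most recent run (split name assumed).
results = load_dataset(repo, "results", split="latest")
```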
## Latest results
These are the [latest results from run 2023-12-24T18:19:49.372068](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K/blob/main/results_2023-12-24T18-19-49.372068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7084211824111288,
"acc_stderr": 0.030357267841177957,
"acc_norm": 0.712160667482289,
"acc_norm_stderr": 0.030947686399368228,
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6483307671941028,
"mc2_stderr": 0.01472680612023713
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902279,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6820354511053575,
"acc_stderr": 0.004647338877642187,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.02575755989310673,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.02575755989310673
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802268,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802268
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8419354838709677,
"acc_stderr": 0.020752831511875278,
"acc_norm": 0.8419354838709677,
"acc_norm_stderr": 0.020752831511875278
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5763546798029556,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.5763546798029556,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295159,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295159
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.02281581309889661,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.02281581309889661
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.030039842454069283,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.030039842454069283
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944214,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944214
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588957,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588957
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971716,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971716
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573975,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813235,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813235
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48379888268156424,
"acc_stderr": 0.01671372072950102,
"acc_norm": 0.48379888268156424,
"acc_norm_stderr": 0.01671372072950102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.021505383121231375,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.021505383121231375
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059682,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059682
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5371577574967406,
"acc_stderr": 0.01273492357953206,
"acc_norm": 0.5371577574967406,
"acc_norm_stderr": 0.01273492357953206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.025035845227711274,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.025035845227711274
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904017,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904017
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789255,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789255
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6483307671941028,
"mc2_stderr": 0.01472680612023713
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498435
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462111
}
}
```
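As an illustration of how these aggregated numbers can be consumed, the sketch below averages the normalized accuracy over the MMLU (`hendrycksTest`) sub-tasks from a dictionary shaped like the JSON above; the two entries shown are copied from this card and merely stand in for the full set:

```python
# Minimal sketch: mean acc_norm over the MMLU ("hendrycksTest") sub-tasks.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7763157894736842},
    # ... remaining "harness|hendrycksTest-*|5" entries from the JSON above
}

mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU acc_norm over {len(mmlu)} sub-tasks: {sum(mmlu) / len(mmlu):.4f}")
```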
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K | [
"region:us"
] | 2023-12-24T18:22:10+00:00 | {"pretty_name": "Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K", "dataset_summary": "Dataset automatically created during the evaluation run of model [Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K](https://huggingface.co/Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T18:19:49.372068](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K/blob/main/results_2023-12-24T18-19-49.372068.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7084211824111288,\n \"acc_stderr\": 0.030357267841177957,\n \"acc_norm\": 0.712160667482289,\n \"acc_norm_stderr\": 0.030947686399368228,\n \"mc1\": 0.4883720930232558,\n \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6483307671941028,\n \"mc2_stderr\": 0.01472680612023713\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902279,\n \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6820354511053575,\n \"acc_stderr\": 0.004647338877642187,\n \"acc_norm\": 0.8759211312487553,\n \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.02575755989310673,\n \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.02575755989310673\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.03216600808802268,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.03216600808802268\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.03106898596312215,\n \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.03106898596312215\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8419354838709677,\n \"acc_stderr\": 0.020752831511875278,\n \"acc_norm\": 0.8419354838709677,\n \"acc_norm_stderr\": 0.020752831511875278\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5763546798029556,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.5763546798029556,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295159,\n \"acc_norm\": 0.9637305699481865,\n 
\"acc_norm_stderr\": 0.013492659751295159\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.02281581309889661,\n \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.02281581309889661\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.030039842454069283,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.030039842454069283\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944214,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944214\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588957,\n \"acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588957\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971716,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971716\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573975,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573975\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813235,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813235\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48379888268156424,\n \"acc_stderr\": 0.01671372072950102,\n \"acc_norm\": 0.48379888268156424,\n \"acc_norm_stderr\": 0.01671372072950102\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.021505383121231375,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.021505383121231375\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n \"acc_stderr\": 0.022827317491059682,\n \"acc_norm\": 0.797427652733119,\n \"acc_norm_stderr\": 0.022827317491059682\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5371577574967406,\n \"acc_stderr\": 0.01273492357953206,\n \"acc_norm\": 0.5371577574967406,\n \"acc_norm_stderr\": 0.01273492357953206\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711274,\n \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711274\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904017,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904017\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789255,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789255\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4883720930232558,\n \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6483307671941028,\n \"mc2_stderr\": 0.01472680612023713\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498435\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5943896891584534,\n \"acc_stderr\": 0.013524848894462111\n }\n}\n```", "repo_url": "https://huggingface.co/Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|arc:challenge|25_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|gsm8k|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hellaswag|10_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-19-49.372068.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-19-49.372068.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-19-49.372068.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T18-19-49.372068.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-19-49.372068.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T18_19_49.372068", "path": ["**/details_harness|winogrande|5_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T18-19-49.372068.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_24T18_19_49.372068", "path": ["results_2023-12-24T18-19-49.372068.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T18-19-49.372068.parquet"]}]}]} | 2023-12-24T18:22:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K
Dataset automatically created during the evaluation run of model Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
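A minimal sketch is shown below; the repository id follows the leaderboard's usual `details_<org>__<model>` naming and the `harness_winogrande_5` configuration listed in this card's file index, so treat both names as assumptions rather than confirmed values.
```python
from datasets import load_dataset

# Load one evaluated task (Winogrande, 5-shot) from the details repository.
# The repository id is inferred from the leaderboard naming convention, not confirmed here.
data = load_dataset("open-llm-leaderboard/details_Brillibits__Instruct_Mixtral-8x7B-v0.1_Dolly15K",
                    "harness_winogrande_5",
                    split="train")
```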
## Latest results
These are the latest results from run 2023-12-24T18:19:49.372068 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K\n\n\n\nDataset automatically created during the evaluation run of model Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T18:19:49.372068(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K\n\n\n\nDataset automatically created during the evaluation run of model Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T18:19:49.372068(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
207,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K\n\n\n\nDataset automatically created during the evaluation run of model Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T18:19:49.372068(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
28674a13fccece3bcd2594b0d5e42f7d0d066fee |
## Description
Machine-translated Hebrew version of the STS-B dataset, with additional records: `augmented` - non-matching records that were generated by a weak generative model, and `chatgpt` - paraphrases that were generated by ChatGPT accordingly.
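A minimal usage sketch, assuming the standard `datasets` loading API for this repository (`imvladikon/stsb_he`) and the `source` values shown in the sample below:
```python
from datasets import load_dataset

# Load the Hebrew STS-B variant and keep only the machine-translated pairs
ds = load_dataset("imvladikon/stsb_he", split="train")
translated = ds.filter(lambda ex: ex["source"] == "machine-translated")
print(translated[0])  # e.g. {'label': ..., 'sentence1': ..., 'sentence2': ..., 'source': 'machine-translated'}
```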
## Sample
```json
[{'label': 4.666999816894531, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'מישהו מקפל פיסת נייר.', 'source': 'machine-translated'},
{'label': 0.0, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'כתב מייל.', 'source': 'augmented'},
{'label': 4.0, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'אדם מכפיל ניידת נייר.', 'source': 'chatgpt'}]
``` | imvladikon/stsb_he | [
"task_categories:sentence-similarity",
"language:he",
"region:us"
] | 2023-12-24T18:37:56+00:00 | {"language": ["he"], "task_categories": ["sentence-similarity"], "dataset_info": {"features": [{"name": "label", "dtype": "float64"}, {"name": "idx", "dtype": "int64"}, {"name": "sentence1", "dtype": "string"}, {"name": "sentence2", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3122340, "num_examples": 14597}, {"name": "validation", "num_bytes": 670209, "num_examples": 3489}], "download_size": 1879461, "dataset_size": 3792549}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-12-24T21:27:58+00:00 | [] | [
"he"
] | TAGS
#task_categories-sentence-similarity #language-Hebrew #region-us
|
## Description
Machine-translated Hebrew version of the STS-B dataset, with additional records: 'augmented' - non-matching records that were generated by a weak generative model, and 'chatgpt' - paraphrases that were generated by ChatGPT accordingly
## Sample
| [
"## Description\n\nMachine-translated Hebrew version sts-b dataset, with additional records: 'augmented' - not-matched records that were generated by weak generative model, and \"chatgpt\" - paraphrases that were generated by chatgpt accordingly",
"## Sample"
] | [
"TAGS\n#task_categories-sentence-similarity #language-Hebrew #region-us \n",
"## Description\n\nMachine-translated Hebrew version sts-b dataset, with additional records: 'augmented' - not-matched records that were generated by weak generative model, and \"chatgpt\" - paraphrases that were generated by chatgpt accordingly",
"## Sample"
] | [
24,
63,
3
] | [
"passage: TAGS\n#task_categories-sentence-similarity #language-Hebrew #region-us \n## Description\n\nMachine-translated Hebrew version sts-b dataset, with additional records: 'augmented' - not-matched records that were generated by weak generative model, and \"chatgpt\" - paraphrases that were generated by chatgpt accordingly## Sample"
] |
5e88e53f232dfa9a1720b168f78ffb27a1004673 |
# Dataset Card for Evaluation run of crumb/apricot-wildflower-20
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [crumb/apricot-wildflower-20](https://huggingface.co/crumb/apricot-wildflower-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_crumb__apricot-wildflower-20",
"harness_winogrande_5",
split="train")
```
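As a complementary sketch, the aggregated metrics mentioned above can also be read from the "results" configuration; the `latest` split name is taken from this card's file listing, so treat it as an assumption if your copy differs:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run from the "results" configuration
results = load_dataset("open-llm-leaderboard/details_crumb__apricot-wildflower-20",
                       "results",
                       split="latest")
print(results.column_names)
```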
## Latest results
These are the [latest results from run 2023-12-24T18:40:43.100930](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__apricot-wildflower-20/blob/main/results_2023-12-24T18-40-43.100930.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6297386827824386,
"acc_stderr": 0.032369576414532454,
"acc_norm": 0.6363163136666092,
"acc_norm_stderr": 0.033027813944829704,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4176270318067943,
"mc2_stderr": 0.014113875719654318
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526842,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6138219478191596,
"acc_stderr": 0.004858771963468873,
"acc_norm": 0.8175662218681538,
"acc_norm_stderr": 0.003854123373509104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057096,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057096
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381392,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963554,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046095,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046095
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502348,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502348
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406752,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406752
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4176270318067943,
"mc2_stderr": 0.014113875719654318
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.33965125094768767,
"acc_stderr": 0.013045045067665252
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_crumb__apricot-wildflower-20 | [
"region:us"
] | 2023-12-24T18:42:58+00:00 | {"pretty_name": "Evaluation run of crumb/apricot-wildflower-20", "dataset_summary": "Dataset automatically created during the evaluation run of model [crumb/apricot-wildflower-20](https://huggingface.co/crumb/apricot-wildflower-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_crumb__apricot-wildflower-20\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T18:40:43.100930](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__apricot-wildflower-20/blob/main/results_2023-12-24T18-40-43.100930.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6297386827824386,\n \"acc_stderr\": 0.032369576414532454,\n \"acc_norm\": 0.6363163136666092,\n \"acc_norm_stderr\": 0.033027813944829704,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4176270318067943,\n \"mc2_stderr\": 0.014113875719654318\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526842,\n \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6138219478191596,\n \"acc_stderr\": 0.004858771963468873,\n \"acc_norm\": 0.8175662218681538,\n \"acc_norm_stderr\": 0.003854123373509104\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057096,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057096\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159462,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159462\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381392,\n \"acc_norm\": 
0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381392\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n \"acc_stderr\": 0.014987325439963554,\n \"acc_norm\": 0.2782122905027933,\n \"acc_norm_stderr\": 0.014987325439963554\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046095,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046095\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n \"acc_stderr\": 0.012707390438502348,\n \"acc_norm\": 0.45045632333767927,\n \"acc_norm_stderr\": 0.012707390438502348\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4176270318067943,\n \"mc2_stderr\": 0.014113875719654318\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33965125094768767,\n \"acc_stderr\": 0.013045045067665252\n }\n}\n```", "repo_url": 
"https://huggingface.co/crumb/apricot-wildflower-20", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|arc:challenge|25_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|gsm8k|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hellaswag|10_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-40-43.100930.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-40-43.100930.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-40-43.100930.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T18-40-43.100930.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-40-43.100930.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T18-40-43.100930.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["**/details_harness|winogrande|5_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T18-40-43.100930.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_24T18_40_43.100930", "path": ["results_2023-12-24T18-40-43.100930.parquet"]}, {"split": "latest", "path": 
["results_2023-12-24T18-40-43.100930.parquet"]}]}]} | 2023-12-24T18:43:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of crumb/apricot-wildflower-20
Dataset automatically created during the evaluation run of model crumb/apricot-wildflower-20 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
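A minimal sketch of that call (the repository id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption rather than quoted from this card; the config and split names come from the metadata above):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's naming convention.
# "harness_winogrande_5" and the "latest" split are listed in this dataset's metadata.
data = load_dataset("open-llm-leaderboard/details_crumb__apricot-wildflower-20",
	"harness_winogrande_5",
	split="latest")
```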
## Latest results
These are the latest results from run 2023-12-24T18:40:43.100930 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of crumb/apricot-wildflower-20\n\n\n\nDataset automatically created during the evaluation run of model crumb/apricot-wildflower-20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T18:40:43.100930(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of crumb/apricot-wildflower-20\n\n\n\nDataset automatically created during the evaluation run of model crumb/apricot-wildflower-20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T18:40:43.100930(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of crumb/apricot-wildflower-20\n\n\n\nDataset automatically created during the evaluation run of model crumb/apricot-wildflower-20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T18:40:43.100930(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
03ade06ea1d81422b225d5a00a4666095f756a58 | I wanted to see what is "toxic".
In fact, it's trash.
I'll leave it as is for the courageous ones who would like to modify it, haha.
Clean view of what was "TOXIC" in English: [here](https://huggingface.co/datasets/Undi95/oasst2_toxic/blob/main/%2BToxic0.5-Spam0.5%2BLangEn-CLEANED.jsonl) | Undi95/oasst2_toxic | [
"region:us"
] | 2023-12-24T19:24:21+00:00 | {} | 2023-12-24T19:56:38+00:00 | [] | [] | TAGS
#region-us
| I wanted to see what is "toxic".
In fact, it's trash.
I'll leave it as is for the courageous ones who would like to modify it, haha.
Clean view of what was "TOXIC" in English: here | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
58332e8379bd0a0ea2ae73d155d1105b336905f6 |
# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [itsliupeng/Mixtral-8x7B-v0.1-top3](https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
"harness_winogrande_5",
split="train")
```
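To pull the aggregated metrics mentioned above rather than per-task details, the same call can target the "results" configuration; this is a sketch assuming the "results"/"latest" layout the leaderboard uses for its detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: "results" config, "latest" split
# (a timestamped split such as "2023_12_24T20_17_11.492534" should also exist).
results = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"results",
	split="latest")
```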
## Latest results
These are the [latest results from run 2023-12-24T20:17:11.492534](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3/blob/main/results_2023-12-24T20-17-11.492534.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7170056917230552,
"acc_stderr": 0.03001643545450993,
"acc_norm": 0.721542301890713,
"acc_norm_stderr": 0.030593708007455218,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454607,
"mc2": 0.48575008545428044,
"mc2_stderr": 0.014261044394633108
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693244
},
"harness|hellaswag|10": {
"acc": 0.6715793666600279,
"acc_stderr": 0.004686789042445367,
"acc_norm": 0.8662617008564031,
"acc_norm_stderr": 0.0033967530276634164
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.025757559893106734,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.025757559893106734
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7191489361702128,
"acc_stderr": 0.029379170464124825,
"acc_norm": 0.7191489361702128,
"acc_norm_stderr": 0.029379170464124825
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.020039563628053286,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.020039563628053286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562094,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02626502460827588,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02626502460827588
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611743,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494732,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494732
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625856,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625856
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305742,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305742
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252552,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252552
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.022733789405447593,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.022733789405447593
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.01942026010943829,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.01942026010943829
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5404172099087353,
"acc_stderr": 0.012728446067669952,
"acc_norm": 0.5404172099087353,
"acc_norm_stderr": 0.012728446067669952
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.02406059942348742,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.02406059942348742
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.01626105528374613,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.01626105528374613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904014,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904014
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454607,
"mc2": 0.48575008545428044,
"mc2_stderr": 0.014261044394633108
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320708
},
"harness|gsm8k|5": {
"acc": 0.5754359363153905,
"acc_stderr": 0.013614835574956375
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3 | [
"region:us"
] | 2023-12-24T20:19:28+00:00 | {"pretty_name": "Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3", "dataset_summary": "Dataset automatically created during the evaluation run of model [itsliupeng/Mixtral-8x7B-v0.1-top3](https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-24T20:17:11.492534](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3/blob/main/results_2023-12-24T20-17-11.492534.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7170056917230552,\n \"acc_stderr\": 0.03001643545450993,\n \"acc_norm\": 0.721542301890713,\n \"acc_norm_stderr\": 0.030593708007455218,\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454607,\n \"mc2\": 0.48575008545428044,\n \"mc2_stderr\": 0.014261044394633108\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693244\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6715793666600279,\n \"acc_stderr\": 0.004686789042445367,\n \"acc_norm\": 0.8662617008564031,\n \"acc_norm_stderr\": 0.0033967530276634164\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.025757559893106734,\n \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.025757559893106734\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7191489361702128,\n \"acc_stderr\": 0.029379170464124825,\n \"acc_norm\": 0.7191489361702128,\n \"acc_norm_stderr\": 0.029379170464124825\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4973544973544973,\n \"acc_stderr\": 0.025750949678130387,\n \"acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.025750949678130387\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8548387096774194,\n \"acc_stderr\": 0.020039563628053286,\n \"acc_norm\": 0.8548387096774194,\n \"acc_norm_stderr\": 0.020039563628053286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02626502460827588,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02626502460827588\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611743,\n \"acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625856,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625856\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8748403575989783,\n \"acc_stderr\": 0.011832954239305742,\n \"acc_norm\": 0.8748403575989783,\n \"acc_norm_stderr\": 0.011832954239305742\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252552,\n \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252552\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.022733789405447593,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.022733789405447593\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.01942026010943829,\n \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.01942026010943829\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5404172099087353,\n \"acc_stderr\": 0.012728446067669952,\n \"acc_norm\": 0.5404172099087353,\n \"acc_norm_stderr\": 0.012728446067669952\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.02406059942348742,\n \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.02406059942348742\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.01626105528374613,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.01626105528374613\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904014,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904014\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454607,\n \"mc2\": 0.48575008545428044,\n \"mc2_stderr\": 0.014261044394633108\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320708\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5754359363153905,\n \"acc_stderr\": 0.013614835574956375\n 
}\n}\n```", "repo_url": "https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|arc:challenge|25_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|gsm8k|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hellaswag|10_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_24T20_17_11.492534", "path": ["**/details_harness|winogrande|5_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-24T20-17-11.492534.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_24T20_17_11.492534", "path": ["results_2023-12-24T20-17-11.492534.parquet"]}, {"split": "latest", "path": ["results_2023-12-24T20-17-11.492534.parquet"]}]}]} | 2023-12-24T20:19:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3
Dataset automatically created during the evaluation run of model itsliupeng/Mixtral-8x7B-v0.1-top3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
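```python
from datasets import load_dataset

# Per-task details: pick any of the 63 configuration names listed in the metadata,
# e.g. the 5-shot Winogrande run.
data = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"harness_winogrande_5",
	split="train")

# Aggregated scores: the "results" configuration; the "latest" split points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"results",
	split="latest")
```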
## Latest results
These are the latest results from run 2023-12-24T20:17:11.492534 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
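An abridged excerpt of those aggregate scores, taken from the run metadata stored in this repository (the full per-task breakdown appears in the dataset metadata above):

```python
{
    "all": {
        "acc": 0.7170056917230552,
        "acc_stderr": 0.03001643545450993,
        "acc_norm": 0.721542301890713,
        "acc_norm_stderr": 0.030593708007455218,
        "mc1": 0.32313341493268055,
        "mc1_stderr": 0.016371836286454607,
        "mc2": 0.48575008545428044,
        "mc2_stderr": 0.014261044394633108
    },
    "harness|winogrande|5": {
        "acc": 0.823993685872139,
        "acc_stderr": 0.010703090882320708
    },
    "harness|gsm8k|5": {
        "acc": 0.5754359363153905,
        "acc_stderr": 0.013614835574956375
    }
}
```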
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3\n\n\n\nDataset automatically created during the evaluation run of model itsliupeng/Mixtral-8x7B-v0.1-top3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T20:17:11.492534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3\n\n\n\nDataset automatically created during the evaluation run of model itsliupeng/Mixtral-8x7B-v0.1-top3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-24T20:17:11.492534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3\n\n\n\nDataset automatically created during the evaluation run of model itsliupeng/Mixtral-8x7B-v0.1-top3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-24T20:17:11.492534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |